Teacher Truancy, 2018 Version

We hear a lot about truancy and its malign effects. For example, here.

Courtesy of the Federales, we now have data on teacher absences that may be similarly damaging to students and that certainly are harder to justify.

The 2018 version of the biennial Civil Rights Data Collection, available since October, 2020, has data on both truancy and teacher absences. Let’s look at the teachers.

The feds count full-time teachers absent more than 10 days during the school year. Eleven days is 6.1% of a 180-day school year.
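The threshold arithmetic, for anyone checking, assuming the 180-day year used in the text:

```python
# Sketch of the CRDC threshold arithmetic (assumes the 180-day year cited above).
SCHOOL_YEAR_DAYS = 180
THRESHOLD_DAYS = 10  # CRDC flags teachers absent MORE than 10 days

# The first flagged absence count is 11 days:
share = (THRESHOLD_DAYS + 1) / SCHOOL_YEAR_DAYS
print(f"{share:.1%}")  # → 6.1%
```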

Here are the 2018 counts for the Virginia school divisions.


Richmond is the gold bar at 65%. The red bars, from the left, are the peer jurisdictions Norfolk, Hampton, and Newport News. The blue bar is superimposed on one of the two divisions at the state average, 38%.


Richmond is a bit improved from the 2016 report. That year we were second from worst at 68%; this year, fifth at 65%.

The distribution of division rates looks like this:


The tall, red bar marks the Richmond datum at 65%, 1.9 standard deviations above the mean.
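For the record, the z-score arithmetic behind "1.9 standard deviations above the mean." The standard deviation here is back-computed from the post's own numbers (mean 38%, Richmond 65%, z = 1.9), so it is an inference, not a value from the raw data:

```python
# Back-computing the implied standard deviation from the post's figures.
mean_rate = 38.0      # state average, %
richmond_rate = 65.0  # Richmond, %
std_dev = (richmond_rate - mean_rate) / 1.9  # implied sd, ca. 14.2 points

z = (richmond_rate - mean_rate) / std_dev
print(round(z, 1))  # 1.9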

The Richmond school data show only Community, Ginter Park, Greene, Munford, and Alternative below the state average.


Little kids are notorious for being petri dishes for germs, so we could expect the elementary schools to run high. In fact, except for that remarkable 9% number at Alternative, Richmond’s elementary schools reach both ends of the Richmond spectrum.


You might think that the very few schools where the teachers mostly come to work must have unusually healthy students and teachers or excellent principals. I’ll vote for the latter. Similarly, the very many schools where most of the teachers miss a lot of school would seem to have remarkably sick people or lousy leadership. There’s no reason to expect a plethora of sick people when some schools clearly don’t have many.

And, for sure, there is a leadership vacuum downtown and at the Board of “Education.”

In contrast, the middle schools run high while the high schools cluster toward the high-middle.



BTW: Maggie Walker is not a Richmond Public School (although the SOL pass rates there are reported at the public high schools in the students’ home school zones) and the numbers there are in a different universe from even the best of the Richmond high schools.


In any case, it is clear that the RPS “leaders” downtown need to be directed to work that is better suited to their talents.

We might expect the SOL performance to decline with increasing teacher truancy. Here are the reading data for the Richmond elementary schools:


Note: Virginia’s economically disadvantaged (here “ED”) students pass the SOLs at rates some 17 to 22 points lower than their more affluent peers (here “Not ED”), depending on the subject. Thus the school and division average pass rates depend both on student performance and the relative numbers of ED and Not ED students. This punishes the schools and divisions with large ED enrollments. We’ll avoid that issue here by looking at the rates for both groups.

For the Richmond elementary schools, the ED reading rate drops by some 2.2% for a 10% increase in teacher truancy, but the R-squared value tells us that the teacher absences explain only about 12% of the SOL variance. As to the Not ED students, any effect is smaller (ca. 0.9% for a 10% increase in teacher absences) and very nearly uncorrelated.
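For readers who want to reproduce this kind of fit, here is a minimal sketch. The (absence, pass-rate) pairs are invented stand-ins for the real CRDC/VDOE numbers:

```python
import numpy as np

# Illustrative only: made-up (teacher absence %, ED reading pass %) pairs;
# the real data live in the CRDC and VDOE files.
absence = np.array([30., 45., 55., 65., 75., 85.])
ed_pass = np.array([72., 70., 65., 68., 60., 62.])

slope, intercept = np.polyfit(absence, ed_pass, 1)
pred = slope * absence + intercept
ss_res = np.sum((ed_pass - pred) ** 2)
ss_tot = np.sum((ed_pass - ed_pass.mean()) ** 2)
r_squared = 1 - ss_res / ss_tot

# slope * 10 is the pass-rate change per 10-point rise in teacher absences;
# r_squared is the share of SOL variance the absences "explain."
print(f"{slope * 10:+.1f} points per 10% absence; R² = {r_squared:.0%}")
```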

The math data give similar results.


The division data paint a similar picture.



In contrast to the national data, these Virginia numbers do not show a strong relationship between SOL performance and teacher truancy. They do suggest that the absent teachers may not be much more effective at teaching than the substitutes. And, in any case, the data spotlight a massive, and expensive, management failure, both in Richmond and statewide.

Commonwealth of Lake Woebegon

If you wanted to boost the pass rates of the SOLs, you’d have three choices (aside from the one perfected at Carver): Improve teaching, make the tests easier, or relax the scoring.

On the 2019 revision of the math tests, the Board of “Education” chose the last option: In five of six cases, they adopted cut scores below the level necessary to retain the rigor of the earlier tests. The results were predictable (and, of course, fed the false notion that student performance was improving).


The Board now has jiggered the English tests to the same end. The recommendation (teacher-driven; no pretense here of objectivity) was for every cut score to be lower (easier) than the level necessary to maintain the rigor of the tests.


The Board rejected the Superintendent’s slightly higher recommendations and adopted the committee’s numbers (video at 1:44:55; minutes are not yet available). This grade inflation will have the happy result of making the Board and the Superintendents and the English teachers and the students all look better, all without anybody having to break a sweat.

It will also make it impossible to measure the effect of the coronavirus on English performance.

This is not an anomaly, but rather part of an ongoing campaign to camouflage the fading performance of the Virginia public school system. However, unfortunately for the self-serving mendacity of the “education” establishment, the NAEP data for fourth grade reading


and eighth grade reading


give away the game.

Your tax dollars at “work.”

Richmond: The Excellent, the OK, and the Awful

While we wait to see how far the Board of Education will punt on the 2021 SOL testing, let’s look in some detail at the 2019 performance of Richmond’s schools (there having been no testing in 2020).

But first, some important background: Statewide, “economically disadvantaged” (here, “ED”) students underperform their more affluent peers (“Not ED”) by some 17 to 22 points, depending on the subject. Thus the school and division average pass rates depend both on student performance and the relative numbers of ED and Not ED students. This punishes the schools and divisions with large ED enrollments. We’ll avoid that issue here by looking at the rates for both groups.

To start, here are the ED pass rate distributions of Virginia and Richmond schools on the reading tests.


The blue bars are the counts of Virginia schools with the indicated (rounded) pass rates. The red bars, with open middles so the state data can show through, are Richmond; the Richmond scale is on the right-hand axis.

The state data here (and even more in the next chart) are skewed toward the low end. That renders the usual measures of a distribution, the mean and standard deviation, less useful. The measure reported here is the median.
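A toy illustration (invented rates, not the VDOE data) of why a low-end skew makes the median the better summary:

```python
import statistics

# Invented pass rates with a low-end skew: two failing schools drag the mean
# well below the "typical" school, while the median stays put.
pass_rates = [20, 35, 60, 72, 74, 75, 76, 78, 80, 82]

print(statistics.mean(pass_rates))    # 65.2 — pulled down by the two low schools
print(statistics.median(pass_rates))  # 74.5 — the typical school
```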

The two Richmond schools that aced the reading tests are Open and Community. The next entry, at 86%, is the other selective school, Franklin. The best of the mainstream schools is Marshall at 71%. The only other school to beat the state median was Cary at 68%. The Richmond schools in the cellar include, from the bottom, Alternative, Fairfield Court, MLK, Carver, Woodville, Chimborazo, and Mason.

The Not ED data portray another disaster.


Community and Open again aced the tests. They are followed by Munford, Hill, Franklin, Fox, Alternative and, barely above the state median, Patrick Henry. At the other end, the largest failures are, from the left, Greene, MLK, Boushall, Woodville, Elkhardt-Thompson, and Henderson. Fairfield Court would surely be in that latter list but for the suppression rule (<10 Not ED students).

Turning to the math tests, the Richmond pass rates are even less encouraging:


The schools that beat the state median are, from the top, Open, Community, Cary, Franklin, and Redd. At the other end, the basement dwellers are, from the bottom, Alternative, MLK, Fairfield Court, Carver, Boushall, Wythe, and Henderson.


As to Not ED, Open, Community, Munford, Fox, and Ginter Park beat the state median. Boushall, MLK, Wythe, Greene, Henderson, Elkhardt-Thompson, and Blackwell all scored below 50%.

These data emphasize the huge spreads between Richmond’s best and worst schools as well as the stunning under-performance of flocks of Richmond’s students.

For the record, here are the data, sorted by decreasing averages of the four data points. The “#DIV/0!” entries are for cases where the student count was zero or, more likely, suppressed by VDOE because it was <10.


Region 7 Addendum

We have seen that the divisions in SW Virginia (“Region 7” in the VDOE system) formed their own organization, the Comprehensive Instructional Program (“CIP”), that brought nice improvements in student performance.

While we wait to see whether the Board of “Education” will punt on the 2021 SOL testing, I’ve been looking over the 2019 data (there being no tests in 2020). The data for Region 7 paint a lovely picture.

You may recall that, since undertaking the CIP, Region 7 has seen major improvements in the pass rates of its economically disadvantaged (“ED”) students.


They accomplished this with a large and increasing ED population.


To put the 2019 results in a more nuanced context, let’s start with the school average reading pass rates for the Not ED students.


The blue bars are the counts of Virginia schools with the indicated 2019 pass rates of Not ED students (rounded to the nearest whole number). Thus, one school (Fairfax County Adult High) turned in a 13% pass rate(!) and 102 schools had 88% rates. The red-bounded bars are Region 7, left open to allow the state numbers to show through. The Region 7 scale is on the right vertical axis. The lowest school there turned in a 69% while 11 schools had 91% rates. (Excel reports “multiple items” when you tell it to report data for more than one division; please read that term as “Region 7.”)

The usual statistical measures, mean and standard deviation, are of limited use with skewed distributions, so I show the medians here. Of course, as a distribution approaches “normal,” the median approaches the mean. In any case, these are medians of the school averages, not division medians.

If you think the Not ED pass rates for Region 7 schools are a pleasant bit of news, take a look at the ED numbers:


Here, the Region 7 median is ten points higher than the state. Or you might prefer to ignore those stats and just look at the lovely picture.

The math data similarly testify to the success of the CIP.



It is instructive to compare the (manifestly sensible) techniques used by the CIP with the resolutely ineffective bureaucratic nonsense imposed by the “education” establishment.

The CIP:

  • Identify the good teachers,
  • Share their materials and techniques,
  • Measure what works,
  • Focus on core skills,
  • Set high expectations,
  • Bond with the students, and
  • Use the feckless VDOE only for what it actually can do well: crunch numbers.

The state, here the Petersburg Corrective Action Plan (for a division that the state has been attempting to repair, without success, since 2004):

I think it is past time to redirect the education bureaucracy to what it can do well, crunch numbers, and give the rest of its budget to the CIP.

More Money and Less Education in Richmond

On the subject of spending for schools, the VDOE Web Site has 2019 data for division income by source and fall enrollments of both economically disadvantaged (“ED”) students and their more affluent peers (“Not ED”).

The division income totals per student, plotted against the % of ED enrollment, look like this:


Richmond is the enlarged, yellow point. The red points are, from the left, the peer cities Newport News, Hampton, and Norfolk.

The fitted line suggests that per student division income increases by $68 for a 10% increase in the ED percentage but the R-squared value tells us the variables are uncorrelated.

Richmond is in 15th place in this find-the-money derby.


In terms of local funding, Richmond again is above average but down in 27th place.


The R-squared value of 7% suggests a slight correlation between local funding and % ED, but in the decreasing direction.

The other funding sources present a more interesting picture.


State funding shows a modest correlation, R-squared = 22%, while the federal data exhibit a more robust R-squared of 42%. Funding in both categories increases with increasing % ED.  Richmond is well below the fitted curve for state funding, with part of that gap closed by Uncle Sam.

The Sales Tax funding is essentially flat.

Looking again at just the Big Winners:


Here we see that larger than average local taxes support the effort in every case while the State accounts for some of the excess in Highland and Sussex and the federales do so in Surry, Sussex, and Richmond.

Of course, once that money comes in, the divisions spend it. We earlier saw the expenditure data juxtaposed with Richmond’s lousy performance:


If only Richmond would stop whinging about money and start educating its schoolchildren.

Learning and Poverty

An earlier post showed the absence of a correlation between division day school expenditures and either the average SOL pass rate of economically disadvantaged (“ED”) students (mostly those who qualify for the free lunch program) or those of their more affluent peers (“Not ED”).  As well, the post dwelled on the remarkable progress of ED students under the Comprehensive Instructional Program, a bottom-up initiative that started in VDOE’s Region 7 (southwest Virginia).

Regarding that post, the estimable Dick Hall-Sizemore asks whether the relative numbers of ED and Not ED students make a difference as to SOL performance.

The short answer is NO as to reading and slightly as to math, but with the (modest) effect in a place where you might not expect it.

For the longer answer, here are the data:


The average reading rate for ED students increases 1.8% going from 0% to 100% ED students in the tested group. However, the R-squared of 0.1% tells us the two variables are not correlated.

The Not ED performance, in contrast, decreases by 1.4% for a 10% increase in the ED population. The R-squared value indicates that the ED population explains 11% of the variance in those pass rates.  Thus, to a modest extent, it seems that increasing the proportion of less affluent students is related to lowered performance of the more affluent students.

If you have a testable explanation for this result, please do share it with me.

Richmond is the enlarged points with yellow fill. The red fills are the peer cities, Hampton and Newport News on the left (Hampton above) and Norfolk on the right.

Notice particularly the several divisions with both ED percentages and reading pass rates higher than Richmond’s.


Indeed, the sole exception among that dozen divisions is the Petersburg Not ED datum.

The math data tell a similar story, but with a steeper decline in the Not ED pass rates (3.3% for a 10% ED population increase) and a more substantial R-squared value, 24%.


ED population also fails to explain the better performance of the Region 7 schools that started the (remarkably successful) Comprehensive Instructional Program.  Region 7 has considerably more than the average population of ED students.


Note: There is an anomaly in these numbers. Washington County goes from 62.3% ED in 2018 to 99.1% in 2019. This looks like a data error.  I have asked VDOE about it; they have not replied (presumably too busy with programs that don’t work). The 2019 average with Washington Co. excluded would be 56.0%. For now, it would be wise to ignore the Washington Co. numbers and to discount that 61% some.

As to the details:



Note: The points at 99% are Washington County.

It is instructive to compare the manifestly effective CIP approach in Region 7 with the resolutely ineffective VBOE “Plan” for dealing with the ongoing failure of the Petersburg system.

STOP! Please go back and read all of each of those lists so you can fully appreciate the fecklessness of the Board’s approach.

A modest proposal: Let’s expand the CIP statewide and shrink the Board of Education’s function to what they can usefully do: statistics and webinars.

Money Don’t Buy You Learning

It’s December. The Generous Assembly is about to return and the demands for more education funding [see Executive Summary at p.4] resound throughout the Commonwealth.

The data would suggest that these demands are misplaced.

VDOE won’t post the 2020 expenditure data until sometime this Spring and there were no 2020 SOLs, so we’ll use the 2019 expenditure and SOL data. The expenditure numbers below are those for “day school operation” (the sum of Administration, Instruction, Attendance and Health Services, Pupil Transportation, and O&M spending). Student counts are the year-end average daily membership.

One wrinkle: Statewide, “economically disadvantaged” (here, “ED”) students underperform their more affluent peers (“Not ED”) by some 17 to 22 points, depending on the subject. Thus the division average pass rates depend both on student performance and the relative numbers of ED and Not ED students. We’ll avoid that issue here by looking at the rates for both groups.

Here, then, are the division average reading pass rates for the two groups plotted v. the division day school expenditure per student.


Richmond is the enlarged points with yellow fill. The red-filled points are, from the left, the peer cities Hampton, Norfolk, and Newport News.

The fitted lines suggest that performance of the Not ED students increases slightly with expenditure (about 2% per $10,000) while the ED scores decrease (ca. 4% per $10,000). The R-squared values, however, tell us there is only a minuscule correlation between the pass rates and the expenditures.

We can get a clearer view of the data for Richmond and the peer cities by expanding the axis to hide the Very Big Spenders.



Need I say it? Richmond is spending well above average money and obtaining lousy results for both groups of students.

The math data tell the same story: More Money doesn’t correlate with more learning; Richmond spends a lot and gets awful pass rates.



As a pleasant contrast to that bad news, the (locally created and run) Comprehensive Instructional Program has produced remarkable gains in the Southwest:


They’ve told us how they achieved this:

  • Identify the good teachers,
  • Share their materials and techniques,
  • Measure what works,
  • Focus on core skills,
  • Set high expectations,
  • Bond with the students, and
  • Use the feckless VDOE only for what it actually can do well: crunch numbers.

While the Generous Assembly is in town perhaps they will consider taking the school improvement budget that is wasted at VDOE and giving it to the CIP, where they know how to get results.

More Graduates, Less Learning

The estimable Jim Bacon notices the increased graduation rates this year and wonders how much of the increase reflects the waivers issued by the Superintendent.  We have some of the underlying data.

On May 26, the Governor issued Executive Order 51 that provides, in part:

Authorization for the heads of executive branch agencies, on behalf of their regulatory boards as appropriate, and with the concurrence of their Cabinet Secretary, to waive any state requirement or regulation. . . .  All waivers issued by agencies shall be posted on their websites.

The “guidance” on the VDOE Web Site provides, also in part:

The following graduation requirements are waived based on authority granted to the Superintendent of Public Instruction per Executive Order Fifty-One (2020):

  • Students currently enrolled in a course for which they need a verified credit in order to graduate;

  • Students who have previously been awarded standard credit, but have not earned the associated verified credit;

  • Students who have not completed the student-selected test;

  • Students who are currently enrolled in or have previously completed a course leading to a CTE credential necessary for a Standard Diploma but have not yet earned the credential;

  • Students who have not completed a United States and Virginia history course*;

  • Students who have not completed a fine or performing arts or career and technical education course*;

  • Students in the second of sequential courses*;

  • Students who have not completed an economics and personal finance course*.

VDOE does not set out any direct measure of the effect of these waivers but the history of the cohort graduation rates provides some clues.

But first, what rates shall we measure?

The VDOE announcement brags that the 4-year, “on time,” 2020 cohort graduation rate rose to 92.3% from 91.5% last year, “despite the closure of schools due to COVID-19 in March.”

At the threshold, we might notice that the “on time” graduation rate is manipulated by the inclusion of Modified Standard diplomas that are issued to students who “are unlikely to meet the credit requirements for a Standard Diploma.” The more honest numbers are the “federal” rates (counting only the advanced and standard diplomas), 89.9% this year and 88.7% last year.

Note added later on 10/3: Oops! In fact, there were fewer than ten Modified Standard diplomas this year. The Board already put its thumb on the scale as to that diploma; “The Modified Standard Diploma will not be an option for students with disabilities who enter the ninth grade for the first time beginning in 2013-2014. Credit accommodations allow students with disabilities who previously would have pursued a Modified Standard Diploma to earn a Standard Diploma.” The current jiggering is set out here: “Special education students and limited English students who have plans in place that allow them more time to graduate will be counted as graduates or non-graduates when they earn a diploma or otherwise exit high school.” Translated, that means that Special Ed and LEP students who don’t graduate in four years get counted in the next (or later) year’s cohort. Talk about win-win for boosting the graduation numbers.

Here is the recent history of the federal diploma rates.


The state rate was flat between ‘18 and ‘19; extrapolating that flat trend suggests the 1.2-point improvement in ‘20 was entirely artificial. The fitted line, more generously, shows an increase of ca. 0.3 points per year as of 2019, suggesting that 3/4 of this year’s 1.2-point increase came from the waived requirements. Slice that any way you like: most of the increase reflects administrative fiat, not academic accomplishment.

Thus, of the 98,481 students in the cohort, it looks like somewhere between about 885 and 1180 received bogus diplomas.
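The arithmetic behind that range can be checked directly; the cohort size and the two baselines come from the text above:

```python
# The post's back-of-the-envelope: how many of the 98,481 cohort members owe
# their diplomas to the waivers? The two bounds use the two baselines above.
cohort = 98_481
increase = 1.2  # points of federal-rate gain in 2020
trend = 0.3     # points/year expected from the pre-2020 fitted line

low = cohort * (increase - trend) / 100   # fitted-line baseline
high = cohort * increase / 100            # flat '18-'19 baseline

print(round(low), round(high))  # ≈ the "885 to 1180" range in the text
```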

Closer to home, the Richmond data suggest that the waived requirements artificially reversed Richmond’s plunging graduation rates.

You and I get to wonder what kind of lying bureaucrat would brag that the artificial increase served “to ensure that students were not held back because being unable to take a Standards of Learning test or complete a required course” while ignoring the wholesale grant of degrees to students who would not have earned them.

The distribution of the 2020 division rates skews toward the low end with Richmond, the yellow datum, leading the skew.


Data for Covington, Lexington, and Highland Co. are absent. I’ve colored the 90% datum light blue to indicate the location of the state average.

All that said, “economically disadvantaged” (here, “ED”) students underperform their more affluent peers (“Not ED”).  The division average SOL pass rates for the two groups differ by about 20 points.

Similarly, the (inflated) state average graduation rates for Not ED and ED students differ this year by just 10%, so the division averages again reflect affluence as well as performance. To deal with that, let’s break out the data for those two groups.



Here we see that the state average increase this year came mostly from the ED population. This makes sense, given that the ED group graduates at a lower rate and would be helped more by a waiver of graduation requirements.

As to Richmond, this year’s jump came entirely from the ED average while the graduation rate of Not ED students actually dropped.

(If you can explain that Not ED drop in the face of waived requirements, or those other fluctuations in the Richmond rates, please relieve my confusion with an email to John{at}calaf{dot}org.)

When we separate out the ED and Not ED data, Richmond’s place among the divisions is more complicated.



Richmond is the gold bar in the first graph and is one of the four divisions represented by the gold bar in the second. The light blue bars again mark the counts at the state averages, here 92% and 82%.

As to Not ED students, Richmond’s 59% graduation rate (!) is third from the bottom. (Mercy! What can be going on in Covington, 12%, and Hopewell, 37%?)  As to ED students, Richmond’s 74% is too low, but not in the cellar.

These data do not begin to explain why Richmond’s Not ED graduation rate is so appallingly low, nor why it declined (yet again) this year.

Finally, here are the graduation rate changes from 2019 to 2020, sorted by decreasing ED differences.


It is far from obvious why any of those graduation rates should decrease in light of the waivers, much less how Goochland landed a 24.76% decrease in the ED rate.

“Equity” and the Performance of Virginia’s Black and White Students

The estimable Jim Bacon suggests that the Northam administration’s emphasis on “equity” and “restorative-justice” is keeping disorderly students in the classroom to the detriment of the other students. As well, he posits that behavior problems are more common among black students so the effect should be larger in divisions with larger black populations.

VDOE has some data that might speak to those issues.

Elementary and middle school students mostly take the same tests at the same time. High school, not so much. So let’s look at the data for the elementary and middle school grades.

First, the disorder. The Safe Schools Information Resource goes back to 2015. For grades 3-8, the statewide counts there of individual offenders as a percentage of their ethnic population are:


In absolute terms, the 2019 rate of individual offenders for black students, 16.4%, is 9.2 points above the all-students rate of 7.3%, while the white students’ rate is 2.8 points below.

All three rates increased from 2015: All students by 0.7%, black by 0.8%, and white by 0.6%. That is consistent with the Bacon hypothesis and it confounds any notion that the government’s actions are reducing disorder in our schools. 

The picture in Richmond is less definitive.


Contrary to the state averages, all three of Richmond’s offender rates decreased after 2016: all students by 6.1%, black by 4.4%, and white by 2.2%.

But then there is Fairfax County.


(Oops! Corrected error in the x-axis, 6/26/20. Hat Tip WayneS.)

Overall, the Fairfax rate rose by 3.3% (61.5% of the 2015 rate); the black rate increased by 1.9% (35.3% of the ‘15 rate); the white rate, 0.73% (59.9% of the 2015 starting point).
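Those Fairfax figures mix absolute (point) changes with relative ones. A short sketch back-computes the implied 2015 baseline from the numbers in the text; the baseline itself is an inference, not a figure from the post:

```python
# Back-computing the implied 2015 all-students offender rate in Fairfax from
# "rose by 3.3% (61.5% of the 2015 rate)".
point_change = 3.3       # 2015 -> 2019 rise, percentage points
relative_change = 0.615  # that rise as a share of the 2015 rate

rate_2015 = point_change / relative_change
rate_2019 = rate_2015 + point_change
print(f"{rate_2015:.1f}% -> {rate_2019:.1f}%")  # ≈ 5.4% -> 8.7%
```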

It is tempting to assign the huge increases in the offender counts in Fairfax to the emphasis there on “restorative-justice (pdf)” and “equity.” The smaller increases statewide offer a lesser but still enticing temptation. No telling what those Richmond decreases mean, but with a 2019 offender rate of 17.5%, 2.4 times the state average of 7.3%, the only sensible inference is that it’s past time to move one’s family to one of the nearby counties.

For sure, something awful is going on in Richmond schools and something is getting worse statewide, much worse in Fairfax.

To look for the possibility of an “equity” effect on performance, let’s turn to the SOL pass rates.

The big two SOL tests are reading and math. First, the state average pass rates on the reading tests.


The large drop in 2013 came from new, tougher tests. Those new tests also exacerbated the black/white performance gap.

To the point here, these data show slightly improving pass rates for both black and white students in the first years after the new tests but faltering rates in 2019. During the same period, the gap between black and white students improved (decreased) by 4.0 points, a significant amount, albeit virtually all the improvement came during the recovery, such as it was, from the new tests.

On the math tests, the big drop came with new, tougher tests in 2012. More recently, the pass rates were faltering until they enjoyed a bounce in 2019 from newer, easier tests.  Also in the recent period the black/white gap was worsening slightly but improved a bit with the easier tests.

(There’s a template for making the entire school system look better: Just water down the tests every year.)


The black/white difference improved by 3.4 points from 2013 to 2019 with, again, most of the improvement coming before the “equity” campaign.

If the Governor has had any effect on pass rates, it’s not obvious here. So let’s turn to a couple of the more interesting divisions.

Richmond has one of the largest percentages of black students in the state.


(2019 data. Race group names abbreviated; Am. Indian and Pacific Islander groups both < 0.5% and omitted to simplify the graph).

Or, in terms of the black/white ratio,


The Richmond reading data tell a sorry story.


There’s no recent improvement in the Richmond rates; indeed, the (already appalling) rates for the black students have declined some in the last three years while the black/white gap worsened  by 9.5 points. That gap now stands at 39.3, 188% of the state average gap.

On the math tests, the Richmond gap has deteriorated consistently since 2014, landing at 35.9% in 2019.


Turn now to Fairfax, a county that has been a leader in the restorative-justice (pdf) movement.

The racial distribution in the Fairfax schools is different from the state average and quite different from Richmond.



Both black and white students in Fairfax consistently beat the state averages for their groups but the black/white gaps closely track the state numbers.



These data suggest that the “equity” movement in Virginia’s schools has not improved pass rates on the elementary and middle school reading and math tests and has generally failed to halt the ongoing deterioration of the black/white rate gaps.

The recent, general decline of these pass rates, except under the new, easier math tests, is consistent with increasing disorder in the classrooms but does not establish a causal relationship.

On these data, Bacon is batting at least .750:

  1. Keeping disorderly students in the classroom: Number of offenders is rising;
  2. To the detriment of the other students: Pass rates are falling;
  3. Behavior problems are more common among black students: Astronomically;
  4. The effect should be larger in divisions with larger black populations: Not so, at least on this very small sample of divisions.

Because of the very small number of divisions in the sample here, I need to look at the data for all the divisions. Stay tuned.

More on 2019 Graduation Rates

Having just looked in some detail at the dropout data, let’s turn to the graduation rates. These are 2019, 4-year cohort, on-time graduation data.

But first, some background:

According to the U.S. Census Bureau’s American Community Survey, the population of U.S. 18- through 24-year-olds not enrolled in school and without a high school diploma or General Educational Development, or GED, credential was 16.4 percent in 2009. Among 16- to 24-year-olds who were incarcerated during 2006-07, only 1 in 1,000 had a bachelor’s degree, while 6.3 percent were high school dropouts who didn’t have a GED. (Sum, Khatiwada, McLaughlin & Palma, 2009).

As to Virginia, here are the division average, on-time diploma rates for economically disadvantaged students (“ED”) plotted v. the rates of their more affluent peers (“Not ED”) (data are percentages).


Richmond’s disastrous performance aside, these data share with the dropout data a curious inversion: Given that ED students generally underperform their Not ED peers on the SOL tests (for example, see here), we might expect that the ED graduation rates would be lower than the Not ED. The state averages, ED 87.2 & Not ED 93.9 are consistent with that. But Richmond shows a higher ED than not ED rate, 73.9 v. 65.9. And the fitted line, notwithstanding the relatively low R-squared value, suggests that on average Not ED rates below 84.6 are associated with higher ED than Not ED graduation rates.
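The anomaly check itself is simple to sketch. Division names and rates here are invented except Richmond’s, which come from the numbers above (ED 73.9, Not ED 65.9):

```python
# Flag divisions whose ED on-time graduation rate exceeds their Not ED rate.
# "Division A" and "Division B" are hypothetical; Richmond's rates are from the post.
rates = {
    "Richmond": (73.9, 65.9),    # (ED, Not ED)
    "Division A": (87.2, 93.9),
    "Division B": (90.1, 88.0),
}

inverted = [d for d, (ed, not_ed) in rates.items() if ed > not_ed]
print(inverted)  # ['Richmond', 'Division B']
```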


Indeed, all the divisions above the gray line on the graph below exhibit that anomaly.


Trophy Offer: As with the dropout data, I’ll give a #2 lead pencil as a prize to anybody who can offer a (testable) hypothesis that explains this phenomenon. But before you heat your brain up on this, take a look at the post that will follow this one in a day or two.

Turning to the data by school, we see the details of Richmond’s win in the race to the bottom.


As well, the Richmond schools, other than Marshall and the three selective schools, show anomalously high ED graduation rates.


Of course, Richmond’s graduation rates are a direct reflection of the dropout rates.


The red diamonds are the peers, from the left Newport News, Hampton, and Norfolk.