Failed But Promoted

The data tables in the Superintendent’s Annual Report mostly appear about the same time as the dandelions.  A reader, however, points out that a few tables emerge earlier.  Of interest among those early bloomers, Table 3 reports the 2017-18 fall membership by division along with the number of students NOT repeating the same grade from 2016-17. 

The difference between those two numbers gives the number who DID repeat (i.e., who had NOT been promoted).  Let’s juxtapose those data with the division failure rates (i.e., 100% minus the pass rate) on the SOL tests. 
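For anybody who wants to check the arithmetic, here is a minimal sketch in Python.  The column names and the numbers are made up for illustration; they are not the actual Table 3 layout.

    # Sketch only: hypothetical division record, not the actual Table 3 fields
    def repeat_and_failure_rates(row):
        repeated = row["fall_membership_2017_18"] - row["not_repeating_from_2016_17"]
        repeat_rate = 100.0 * repeated / row["fall_membership_2017_18"]
        failure_rate = 100.0 - row["sol_pass_rate_pct"]   # failure rate = 100% minus pass rate
        return repeated, repeat_rate, failure_rate

    example = {"fall_membership_2017_18": 24_000,      # made-up numbers
               "not_repeating_from_2016_17": 23_600,
               "sol_pass_rate_pct": 77.4}
    repeated, repeat_rate, failure_rate = repeat_and_failure_rates(example)
    print(repeated, round(repeat_rate, 1), round(failure_rate, 1))   # 400 1.7 22.6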

First, the reading tests:

image

Overall, 22.6% of Virginia students did not pass the 2017 reading SOL tests, while a mere 1.7% of the 2017-18 fall membership were students who had not been promoted.

The positive slope is consistent, for the most part, with more students being held back in the divisions with lower pass rates, but the least squares fit to the data shows only a modest correlation (R-squared = 14%).
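For the curious, a fit like that takes only a few lines of Python.  The arrays here are placeholders, not the actual division data.

    import numpy as np

    # Placeholder division-level data: SOL failure rate (%) and share of students retained (%)
    failure_rate = np.array([22.6, 30.1, 18.4, 25.0, 27.3])
    retained_pct = np.array([1.7, 4.2, 1.1, 2.0, 3.5])

    slope, intercept = np.polyfit(failure_rate, retained_pct, 1)     # least squares line
    r_squared = np.corrcoef(failure_rate, retained_pct)[0, 1] ** 2   # R-squared = squared correlation
    print(round(slope, 3), round(intercept, 3), round(r_squared, 2))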

Richmond is the gold square.  The peer jurisdictions, from the top, are Norfolk, Hampton, and Newport News.  Charles City is green; Lynchburg, blue.

The math data paint much the same picture.

image

Average failure rate, 22.4%.  Very slightly better correlation.

We might wonder whether the lower social promotion rates in Norfolk (more than three times as many students held back as the average) and Hampton (almost 2.5 times as many) explain their much better performances vis-a-vis Richmond.  To make that case, however, we would have to explain away Newport News.

There is plenty of room to argue about the wisdom of social promotion.  There is no room to argue with the conclusion that Virginia schools employ it, wholesale.  Indeed, given that the SOL tests “establish minimum expectations,” there is room to conclude that “wholesale” understates the reality.


Myopic Attaboy

VDOE emitted a press release this morning announcing that Alexandria and Chesterfield have been placed on the 9th Annual AP District Honor Roll.  The release quotes the Superintendent:

“Earning a place on the AP District Honor Roll reflects a school division’s commitment to equity and to encouraging all students to reach higher,” Superintendent of Public Instruction James Lane said. “I congratulate the leaders and educators of Alexandria and Chesterfield County for reaching out and identifying students with the potential for success in Advanced Placement courses, and for providing the opportunities and supports new AP students need to succeed.”

All that hoorah overlooks, especially in Alexandria, the wholesale failure of too many children who are in greatest need of some good teaching.

Here, to start, are the 2018 SOL reading pass rates by school for the Alexandria division, juxtaposed with the statewide data, plotted vs. the percentage of economically disadvantaged (“ED”) students taking the tests.

image

The blue diamonds are the average pass rates by Virginia school of students who are not economically disadvantaged (“Not ED”).  The gold triangles are the pass rates of the ED students in those schools. 

Note: Data are for schools where VDOE does not suppress pass rate data for either the ED or Not ED group.

The green circles represent the school average pass rates of Alexandria’s students who are Not ED.  The green line, fitted to those data, shows performance mostly exceeding the state average, the blue dashed line.

To the point here, the yellow circles are the school average pass rates for Alexandria’s ED students.  Note particularly the points at 36%, representing Mount Vernon Elementary where 64% of the ED students failed to pass the reading tests, and 38% at George Mason Elementary, 62% failing.  Note also the yellow fitted line that discloses the wholesale underperformance of ED students in Alexandria schools relative to the state, the gold dashed line.

The math data show a similar pattern:

image

Again, Alexandria schools produce an ED performance that barely breaks 50%, along with two schools below 40% (Mount Vernon again, at 31% this time, and William Ramsay Elementary, 36%).

Our Board of “Education” continues to prefer celebrating success, even if it has to invent that success, rather than dealing with the unacceptable performance of too many of our schools.

That said, Chesterfield paints a prettier picture, nearly mirroring the state averages, albeit with an uncomfortable collection of pass rates <60%:

image

image

And, of course, the press release is quite silent as to what the Board of “Education” is doing to improve the ~20% statewide gap between the Not ED and the ED pass rates.

Economic Disadvantage and Schools’ Reading Performance

In terms of the state and division average SOL pass rates, economically disadvantaged (“ED”) students underperform their more affluent peers (“Not ED”).

For example, here are the 2018 state averages on the reading tests.

This gap makes the SOL average an unfair tool for measuring academic progress because of the invisible disadvantage the averages place on schools with larger populations of ED students.

To start a dive underneath the overall averages, here are the 2018 school average reading pass rates for ED and Not ED students, plotted vs. the percentage of ED students in the school.

The blue diamonds are the school averages for the students who are Not ED.  The fitted line comports with the visual picture:  Pass rates of the Not ED students decline as the percentage of ED students increases.

The 24% R-squared value tells us that %ED explains 24% of the variance in those Not ED pass rates.  Of course, that does not say that the population of ED students causes the score change, just that the two variables are related, albeit loosely.
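In case “variance explained” is opaque, here is the calculation in miniature, with made-up school data (not the VDOE numbers):

    import numpy as np

    # Made-up school-level data: %ED and Not ED pass rate (%)
    pct_ed = np.array([10, 25, 40, 55, 70, 85])
    not_ed_pass = np.array([92, 88, 85, 80, 78, 74])

    slope, intercept = np.polyfit(pct_ed, not_ed_pass, 1)
    predicted = slope * pct_ed + intercept
    ss_res = np.sum((not_ed_pass - predicted) ** 2)            # residual sum of squares (unexplained)
    ss_tot = np.sum((not_ed_pass - not_ed_pass.mean()) ** 2)   # total sum of squares around the mean
    print(round(1 - ss_res / ss_tot, 2))                       # R-squared: share of variance explained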

The orange triangles are the average pass rates of the ED students in those schools.  As we would expect, those numbers mostly are lower than the Not ED rates.  The fitted line shows a more modest negative slope and the R-squared value, 9%, tells us that the correlation between the ED pass rate and the ED percentage in the school is much less robust than the Not ED correlation.

These data comport with the earlier conclusion (again without telling us the underlying factors or mechanism): At the school level, averages for ED students are generally below the averages for Not ED students.

The data also suggest something else: Pass rates of the Not ED students correlate negatively, to a modest degree, with the percentage of ED students in the school.  But the ED students, not so much.

This gets even more interesting if we overlay the data for the schools in a particular division.  Let’s start with the Big Guy, Fairfax.

The green dots are the Fairfax Not ED school averages; yellow, the ED.

The green fitted line for the Not ED students lies nearly atop the all schools line with nearly the same R-squared value.

However, in terms of the school averages the Fairfax ED students underperform the state average; as well, the slope is more negative (-2.7% for a 10% increase in ED population vs. -1.8% for the Not ED students).  Moreover, the R-squared values for the two groups are nearly equal and are large enough to suggest a modest correlation.
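Those “per 10% increase” figures are just the fitted slopes scaled up; a one-liner, with the slopes assumed from the two fitted lines:

    # Slopes assumed from the fitted lines: pass-rate points per one-point increase in %ED
    slope_ed, slope_not_ed = -0.27, -0.18
    print(round(slope_ed * 10, 1), round(slope_not_ed * 10, 1))   # -2.7 and -1.8 points per 10-point rise in %ED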

Why the FFax ED students underperform and why that underperformance increases with the ED population are questions the FFax school board really should address.  For sure, they have a problem there.

Well, that was fun, but distant.  How about Richmond?

Richmond has three schools with low ED populations; the Not ED students in those schools have OK pass rates, but the ED students are a mixed bag.  For the most part, both the ED and Not ED groups perform poorly in the more numerous, high-ED schools, which pulls the fitted lines down.

Indeed, a 10% increase in the ED population is associated with a -5.5% change in the Not ED pass rate and -2.9% in the ED rate.  As well, the R-squared for the Not ED students is reaching toward a robust correlation.  Said in English: On average, Richmond schools with more ED students have lower pass rates, while the pass rates for the Not ED students tend to be lowered more than those for the ED students.

The lowest %ED school, Munford (16%), has a 92% Not ED pass rate and a better than average 77% ED rate.  Richmond Alternative, at 21% ED, has a respectable 87% Not ED rate (especially “respectable” given that it is Richmond’s academic boot camp) but an awful 36% rate for its ED students.  Fox, at 22% ED, has a fine Not ED pass rate, 95%, but a subpar 63% ED rate.

The yellow point at 48% ED, 100% pass rate, is Community High, a select school showing select school results. That yellow point sits atop a 100% green point.

The other Richmond schools whose Not ED students averaged >90% are Franklin (95%) and Cary and TJ (92% each).

The point at 89% ED shows a 29.6% ED pass rate, third worst in the state for ED students; that school is MLK, the worst of our awful middle schools (and second worst overall in the state).

The four yellow points at 100% ED illustrate a minor anomaly in these data: The VDOE suppression rules blocked the head counts for the Not ED students at Greene, Fairfield Court, Southampton, and Woodville, so (1) there are no corresponding Not ED points, and (2) those four ED points sit a few percent farther to the right than they should.  Count that as a bonus: if those points were in the right places, the fitted line would be even steeper.
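To make the anomaly concrete, here is a toy example with made-up head counts: when the Not ED count is suppressed, the %ED calculation loses those students and the point slides to 100%.

    def pct_ed_tested(ed_count, not_ed_count):
        # %ED among tested students, using whatever head counts are reported
        return 100.0 * ed_count / (ed_count + not_ed_count)

    print(round(pct_ed_tested(380, 25), 1))   # 93.8 with the Not ED count available (made-up numbers)
    print(pct_ed_tested(380, 0))              # 100.0 when the Not ED head count is suppressed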

These data say, quite clearly, that Richmond has a problem, especially in its (many) schools with large ED populations.  (The estimable Jim Bacon would suggest that problem, at least in part, is student behavior.)

Richmond will continue to look bad at least until it figures out what is wrong here.  On the SOL averages, it looks even worse than its (awful) performance would merit because of its large ED population.  And, to the point of real life and not sterile numbers, Richmond’s schools are failing, miserably, in their purpose of delivering an education “to enable each student to develop the skills that are necessary for success in school, preparation for life, and reaching their full potential.”  That failure is most stark in the case of the students who are already disadvantaged in economic terms.

For the record, here is the Richmond list.  The #DIV/0! and #N/A entries reflect suppressed data.

There are more insights to be had from these data.  Let’s start with the peer cities.

In Hampton, notice the relatively narrow range of ED percentages, the lower than average pass rates, and the steep fitted lines with non-trivial R-squared values.

Newport News data tell the same story but with much steeper slopes and stronger correlations.

Also Norfolk.

Whew!  That looks like a magnified version of Richmond’s ED issues.

Turning to the ‘burbs, these data rat out Hanover, which performs at the state average for its Not ED students but not so well with ED students, even at the lower ED populations.  Hanover gets good numbers on the statewide list of average pass rates, however, because of its low ED percentages.

Then we have Chesterfield, performing at average for both groups.

And Henrico, with notable correlations and major underperformance by both groups in the higher %ED schools.

Finally, Lynchburg, named for a relative of my paternal grandmother and, to the point here, a place where I have a reader.

Notice the milder correlations here.  Also the outstanding Not ED (95%) and not so outstanding ED pass rate (59%) at the high-ED school (Dearington Elementary).  Also the lowest ED pass rate, 47%, contrasting with an 83% Not ED rate (at Linkhorn Middle).

Bottom line: OK Not ED pass rates in L’Burg; not so good ED.

Next up: Math.

Economic Disadvantage v. More Teachers

We have seen that, on the 2017 data, division average SOL pass rates are not correlated with the numbers of teachers per student.  There is a flaw in that analysis, however:  Economically disadvantaged (“ED”) students score lower on the SOL, on average, than Not ED students.  Thus, divisions with larger populations of ED students tend to have lower overall pass rates.

The VDOE database can break out the pass rates for both ED students and their more affluent peers, so let’s take a more nuanced look.

To start, here are the 2017 division average reading pass rates for both ED and Not ED students graphed vs. the number of teachers per thousand students.  (The latest available teacher numbers are from 2017).
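A note on the ratio: “teachers per thousand students” is just the division’s teacher count divided by its enrollment, times 1,000.  A sketch with made-up figures:

    def teachers_per_thousand(teacher_count, enrollment):
        # teachers per 1,000 students = teacher count / enrollment * 1,000
        return 1000.0 * teacher_count / enrollment

    print(round(teachers_per_thousand(1_450, 12_300), 1))   # 117.9 for a hypothetical division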

The slopes of the least squares fitted lines might suggest that more teachers in the division correlate with decreased pass rates of the Not ED students and slightly increased rates of the ED students.  But the R-squared values tell us that the pass rates in both datasets are essentially uncorrelated with the teacher/student ratios.

In short, these data reach the same result as the overall pass rate data: Divisions with more teachers per student do not, on average, have better pass rates.

The largest exception to that generality, out there with 189 teachers per thousand and a 97.1% Not ED pass rate (and a much better than average ED rate), is Highland County.  The ED superstar is West Point with 135 teachers per thousand and an ED pass rate of 85.2, followed by Wise (115, 84.4) and Bath (159, 82.7).

To single out some other divisions: Richmond is the yellow squares.  The peer cities are the red triangles, from the left Newport News, Hampton, and Norfolk.  Charles City is green; Lynchburg, blue.

Just looking at the graph, Richmond’s ED rate is farther from the fitted line than its Not ED rate.  Indeed, Excel tells us that the Richmond Not ED average is 11.8 points below the all-divisions average.  That is, Richmond’s Not ED students passed at a rate 11.8 points lower than the state average for Not ED students.  The Richmond ED rate is 17.1 points below the state average for ED students.  That is, Richmond’s Not ED performance is poor; its ED performance is half again worse.

Aside from the question why their ED scores are so low, these data falsify Richmond’s frequent excuse that it must deal with a large ED population: Richmond does a lousy job with its Not ED students, and an even worse one with the ED students.  Whatever the cause of Richmond’s awful SOL performance, it infects the entire student population.

Next up, writing:


Pretty much the same story there as the reading data (but notice how Highland reverts toward the mean).  Richmond’s ED rate is the lowest in the state.

The graphs for the other three subjects are chapters in the same book:

images

There is one interesting trend here: Richmond’s ED underperformance, relative to the Not ED students, is much smaller in science and in history/SS than in English, and is somewhat less in math.  To quantify that, here are the Richmond differences from the division means for each group in each subject:

These data do not separate out any of the factors that can affect student performance, other than (roughly grouped) economic status; they do comport with the notion that Richmond has a lot of work to do, especially with its ED students.

To the bottom line: These data are consistent with the conclusion in the recent study that “the evidence suggests at best a small effect [of class size] on reading achievement. There is a negative, but statistically insignificant, effect on mathematics.”

So, when our School Board again says it needs more money, ask them what for.  And if it’s for more teachers (indeed, whatever it’s for), ask them to prove that the money will improve learning.

Larger Classes, Same Performance

The estimable Jim Bacon has posted on the class size/student performance issue.  He used a graph I produced showing division average math pass rates vs. the number of teachers per thousand students.

Oops!  I used 2016 pass rates and 2017 teacher/student ratios.  Using the 2017 pass rates changes the graph slightly but does not modify the conclusions.

Here are the correct data for math and the other four subjects as well.  Pass rates are from VDOE’s very nice (but very slow) database; teacher data are from the Superintendent’s annual report.

First, reading:

The negative slope might suggest that increasing the number of teachers is associated with lower SOL pass rates, but the R-squared value tells us that the two variables are essentially uncorrelated.

Richmond is the gold square on the graph.  The red diamonds are the peer cities, from the left Newport News, Hampton, and Norfolk.  Charles City is green; Lynchburg, blue.

The Big Hitter up there is Falls Church, with a 92.6% pass rate at 119.5 teachers/thousand.  Next in line are West Point (135.0 teachers/thousand, 91.0%) and Poquoson (105.1, 90.7).

The data for the other four subjects tell much the same story.

More on “Accreditation” in Richmond

Under the new accreditation regulation, no Richmond school was denied accreditation this year.  No matter how badly a school has failed its students, the principal now can say his/her school is “accredited” (albeit 24 were accredited “with conditions”).

Three of the most egregious examples of that are here.  I’ve redone the graphs to present the entire list of schools that have acquired accreditation by fiat.  (Data: last year, this year.)

Note: Data here are for year tested; accreditation year is one year later.  Thus, the 2018 data in the graph are from the 2017-18 testing and establish (or would if they meant anything) accreditation status for 2018-19.

images (graphs for the individual schools accredited by fiat)

Note: The elementary schools don’t test for writing.

images (graphs for the individual schools, continued)

Then we have the schools that were “Accredited with Warning” last year but were saved from even that by the new system.

images (graphs for the schools that were “Accredited with Warning” last year)

So much for “this new approach to accountability.”

Let’s Squander More Money!

The Times-Dispatch reports that “Richmond’s top officials spent their quarterly meeting again calling for more money for the city’s school system.”  This abiding demand for more money ignores the more important question:  What was RPS doing with the $335,290,809 it already had?

VDOE has some data on that.

The latest expenditure data are from 2017. (We are three months beyond the end of the 2018 session but VDOE won’t have the 2018 data until about the time we start seeing dandelions.)

These data show division expenditures for operations divided by the end-of-year average daily membership.  A  footnote to the spreadsheet tells us that “[o]perations include regular day school, school food services, summer school, adult education, pre-kindergarten, and other education, but do not include non-regular day school programs, non-local education agency … programs, debt service, or capital outlay additions.”
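So the per student figure is simply that operations total divided by the ADM.  A sketch, using Richmond’s operations total from above and a made-up ADM:

    def per_pupil_operations(operations_dollars, end_of_year_adm):
        # division operations spending divided by end-of-year average daily membership
        return operations_dollars / end_of_year_adm

    print(round(per_pupil_operations(335_290_809, 22_500)))   # about 14,902 with a made-up ADM of 22,500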

By that measure, Richmond is the 17th most expensive division per student:

image

Richmond is the yellow bar.  The peer jurisdictions – Norfolk, Newport News, and Hampton – are the red bars.

We are spending $1,396 per kid more than the state average and $1,881 more than Norfolk.  In terms of SOL scores, we get very little return for all that money.

image

Richmond is the gold square; the red diamonds are the peer jurisdictions, from the top Hampton, Norfolk, and Newport News.  As a courtesy to my reader(s) there, the green circle is Lynchburg and the green diamond is Charles City.

The R-squared value for the least squares fitted line tells us that the reading pass rate and the per pupil expenditure are not correlated.

The math data tell the same story.

image

“But wait!” you say.  We know that the SOL is not a fair measure because “economically disadvantaged” students do not score as well as their more affluent peers.

Indeed.  The Board of Education had a better measure, the Student Growth Percentile, that measured learning and did not correlate with poverty.  They abandoned it, however, because it measured too well: It told us how well each teacher performed.

Denied the best data, all we can do is use the SOL numbers and a little algebra to offset the average effect of poverty. 

On the reading tests in ‘17, the division percentage of “disadvantaged” students predicted about 35% of the variance of the division average SOL scores.

image

(You’ll notice that Richmond grossly underperformed even the fitted line.)

Let’s be generous and calculate an adjusted pass rate as if the correlation were 100%.  (“Generous” for sure: Notice that some of the pass rates get pushed over 100%.)
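The algebra, for the record: add back the fitted slope times each division’s ED percentage.  A sketch, using the reading coefficient quoted below; the pass rate in the example is hypothetical:

    def adjusted_pass_rate(pass_rate_pct, pct_ed, slope=0.28671):
        # add back the average poverty effect implied by the fitted line:
        # roughly 0.287 pass-rate points per point of %ED on the reading tests
        return pass_rate_pct + slope * pct_ed

    print(round(adjusted_pass_rate(59.0, 64.2), 1))   # 77.4, using Richmond's 64.2% ED share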

image

The adjustment for Richmond’s 64.2% poverty rate boosts its rate nicely but only lifts its ranking to fourth from worst, up from second.

And, boosted or not, our scores are low and pricey:

image

The correction for math is slightly different (28.025% of the poverty rate v. 28.671% for reading) and the outcome is slightly worse (Richmond is third worst).

image

In short, poverty does not come close to explaining the awful performance of the Richmond Public Schools.

It would be good if our Leaders were to stop whining about wanting more money and start explaining why they get such lousy results with the very large amount of money they already are spending.

We Beat Petersburg!

In Richmond, the only way to brag about the schools has been to say “We beat Petersburg!”  We can say it again this year.

Barely.

For example, here are the division average reading pass rates.  The graphed data are rounded to the nearest whole number to allow calculation of the distribution; the data in the table are rounded to one decimal place.

image

The blue bar at 56 indicates that one division’s pass rate (it was Danville’s 56.25) rounded to 56.  The blue bar at 57 shows that one division rounded to that value; the red bar there tells us it was Petersburg.  Similarly, Richmond is the one division at 59.

The chart tells us that Richmond was 2.5 standard deviations below the division average, 77.0.  Petersburg was low by 2.8 standard deviations.
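The mechanics, with made-up rates: round to whole numbers to get the distribution, then measure each division’s distance from the mean in standard deviations.  The standard deviation used here is assumed, not the actual figure.

    import numpy as np

    rates = np.array([56.25, 56.8, 59.0, 71.3, 76.6, 77.4, 81.2, 84.9])   # made-up division pass rates
    values, counts = np.unique(np.round(rates).astype(int), return_counts=True)
    print(dict(zip(values.tolist(), counts.tolist())))   # distribution of rates rounded to whole numbers

    def std_devs_below_mean(rate, mean, std_dev):
        return (mean - rate) / std_dev   # positive = below the all-divisions average

    print(round(std_devs_below_mean(59.0, 77.0, 7.2), 1))   # 2.5 (mean from the text; std dev assumed)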

Here are the data for the other four subjects.

image

image

image

image

Finally, the average of the five averages.

image

Notwithstanding these dismal results, all the Richmond schools (except Carver, where they were caught cheating) are accredited.  But, then, so are all the Petersburg schools.

Meanwhile, the Richmond School Board is busy adopting a redundant free speech policy instead of telling us what they will do to improve the teaching.

Charter Middle School?

The Times-Dispatch has a story, “Richmond parents want a charter middle school.  It faces an uphill battle.” 

The snippet on the “Education” section of the Web page tells us the issue:

The power to approve a nascent effort to launch a public charter middle school in Richmond rests solely with officials wary of watching limited dollars needed for their own underperforming schools follow students elsewhere.

The data suggest that those “wary” School Board members and City Great Ones should think a little harder.

Let’s start with the distributions of fifth and sixth grade pass rates on the reading tests.

image

Here we see nine elementary schools with fifth grade pass rates that meet or beat the (former) 75% benchmark for accreditation on the reading tests: Munford, Fox, Southampton, Broad Rock, Stuart (now Obama), Holton, Patrick Henry, Cary, and Bellevue.  The only middle schools that make the same cut for the sixth grade are Alternative and Franklin Military. 

Relevant here, Richmond Alternative operates Spartan Academy, which “serves as a school to support students with academic, attendance and behavior challenges.”  Franklin is a different kind of specialty school; it does a decent job for selected students who elect “to experience a regular academic course of study while participating in a Junior Reserve Officer Training Program or Middle School Leadership Program.”

The highest-scoring non-specialty middle school on the graph is Binford, with a 69.5% pass rate, followed by Hill, 67.4%, and Brown, 62.6%.

The math data paint a similarly ugly picture.

image

Munford leads the fifth grade parade, followed by Cary, Fox, Fisher, Patrick Henry, Broad Rock, and Obama, all of which beat the nominal 70% benchmark for math. 

Franklin and Alternative 6th grade averages again beat the benchmark.  Next in line is Hill, at 65.2%, followed by Brown, 55.4%, and Binford, 54.1%.

(This is not to say that there aren’t serious problems in both elementary and middle schools.  Just look at those collections of pass rates below 50% in both subjects.)

To the point here, there are some Richmond elementary schools where a parent can send a kid while entertaining only the normal parental worries.  But come middle school, parents who can afford it have a good reason to opt for a County or private school.

And opt out they do: The Richmond enrollment plummets after the fifth grade.

image

Or, in terms of the raw enrollments:

image

(That ninth-grade “bump” is a national phenomenon that appears to reflect laissez faire promotion policies in the lower grades.  Most of the drain in Richmond after the 9th grade is dropouts.)

So, the question for our city and school board is more nuanced than just watching dollars flow to a charter middle school:  Would having a decent middle school help stanch the current flow of middle school funds out of the Richmond school system?

College Graduation Rates v. SAT Scores

We have seen that the graduation rates of our 4-year public colleges correlate well with the SAT scores of their freshmen.  Let’s return to that subject with some more recent data.

The most recent SCHEV cohort graduation data are from the 2012-13 first-time, full time freshmen.  SCHEV also has SAT data for entering freshmen that year.

Here, then, for our 4-year public colleges, are the 4-year graduation rates for that 2013 cohort plotted v. those median SAT scores (math + verbal).

image

The correlation is solid:  Entering SAT scores, on average, are an excellent predictor of that graduation rate (recalling, always, that the correlation does not prove causation).

Three schools considerably outperform the fitted line: James Madison and Longwood at +11 and VMI at +7.  The three largest underperformers are Old Dominion (-13), George Mason (-11), and VCU (-7).
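Those plus/minus figures are residuals from the fitted line, i.e., the actual rate minus the rate the line predicts from the school’s SAT median.  A sketch with invented numbers, not the SCHEV data:

    import numpy as np

    # Invented (median SAT, 4-year grad rate %) pairs, purely for illustration
    sat = np.array([1010, 1090, 1160, 1230, 1310, 1400])
    grad4 = np.array([30, 42, 55, 63, 74, 88])

    slope, intercept = np.polyfit(sat, grad4, 1)
    residuals = grad4 - (slope * sat + intercept)   # positive = above the fitted line
    print(np.round(residuals, 1))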

Turning to the five-year rates:

image

The outperformers on the five-year rate are JMU (+12 from the fitted line), Longwood (+10), VMI (+7), and Radford (+4).  The underperformers are Old Dominion (-9), Norfolk State (-8), Wm. & Mary and George Mason (-5), and Mary Washington (-4). 

We might think that outstanding performance at the 4-year level would hinder performance at five years; JMU, VMI, and Longwood (and, to a lesser extent, Radford) all belie that notion.

The five-year rates are remarkably higher, even at THE University and W&M, with the increases generally larger at the schools with the lower rates.

image

Remember: The cohort data are for full time, first-time freshmen.  We might think that these rates would not reflect the larger part-time populations at the urban universities; we might be wrong in that.

BTW: Looking just at graduation rates, we see there is one (private) school that beat even THE University (on the 4-year rate, UVa, 88%; W&L, 93%).

image

We could argue endlessly about the causes for these correlations and differences.  The better question, I suggest, is what all these schools might do to improve their graduation rates:  On average, 45% of those full time, first-time freshmen entering 4-year public colleges did not graduate in four years; 28% did not make it in six. 

In terms of people:  About 14,450 (“about” because of the roundoff of the rate data) of 32,112 first-time, full time freshmen entering 4-year public universities in the fall of 2012 did not graduate in four years; about 8,990 did not make it in six years.
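The arithmetic behind those head counts, using the rounded rates from the paragraph above:

    cohort = 32_112          # first-time, full-time freshmen entering 4-year publics, fall 2012
    share_not_4yr = 0.45     # did not graduate in four years (rounded rate)
    share_not_6yr = 0.28     # did not graduate in six years (rounded rate)

    print(round(cohort * share_not_4yr))   # 14450 -- "about 14,450"
    print(round(cohort * share_not_6yr))   # 8991 -- "about 8,990"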