No SATisfaction

Until yesterday, if you were to look on the RPS Homepage, under “About RPS,” you would have found a list of SAT scores that stopped in 2016.  The estimable Carol Wolf asked RPS when they planned to update the data; Ms. Hudacsko of RPS got things straightened out.

Let’s look at those updated data.  First the reading/writing numbers:

[Figure: SAT reading/writing scores]

*Before 2016, the SAT had separate reading and writing scores; since then, they report a single “reading and writing” number.  The data above (and below) are those combined scores for 2016 and later but only the reading scores before then.

Of course, some schools did better than others.

[Figure: SAT reading/writing scores by Richmond high school]

Turning to math:

[Figure: SAT math scores]

[Figure: SAT math scores by Richmond high school]

It’s not Good News that our best two high schools underperformed the state average on both tests.

For some context, here are data on this year’s admissions at Longwood

[Figure: Longwood admissions data]

and at THE University.

[Figure: admissions data at THE University]

To get some further context, I consulted Dr. Google and found a shortage of posted SAT scores for individual schools.  Fairfax is a notable exception (when you look at the numbers, below, you’ll see why).  Here are their 2018 totals (reading/writing + math), along with the same data for the Richmond high schools and for Richmond, Virginia, and the US.

[Figure: 2018 SAT totals (reading/writing + math): Fairfax and Richmond high schools, with Richmond, Virginia, and U.S. averages]

Now, class: In 25,000 words or fewer, explain how that performance comports with the RPS Web page that says:

[Image: RPS Web page statement]

RPS Embarrassing Itself

The RPS Web site has a chart:

[Image: RPS chart]

The chart tells us the information there is:

[Image: the chart’s date notation]

No telling why they would post old data.

In any case, the “success” headline is a lie.  A quick run to the VDOE database measures the size of that lie: On the 2015–16 SOL tests, just announced in August 2016, Richmond was lowest in the state in terms of the reading pass rate and second lowest in writing, history & social science, math, and science.

[Figures: 2015–16 SOL division pass rate rankings for reading, writing, history & social science, math, and science]

If that is “success,” I am Bill Gates.
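For anyone who wants to check that ranking against the VDOE download, here is a minimal sketch in Python.  The file and column names are placeholders of mine; the actual VDOE export is laid out differently.

```python
import pandas as pd

# Hypothetical extract of the VDOE division-level pass rates; the real
# download uses its own file layout and column names.
df = pd.read_csv("sol_division_pass_rates.csv")  # Division, Subject, PassRate

# Rank divisions within each subject; lowest pass rate = rank 1.
df["Rank"] = df.groupby("Subject")["PassRate"].rank(method="min")

# Richmond's rank in each subject (rank 1 = worst in the state).
print(df.loc[df["Division"] == "Richmond City", ["Subject", "PassRate", "Rank"]])
```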

Returning to the chart, we see this picture of RPS demographics.

[Figure: RPS demographics pie chart]

Hmmm.  That 1.33% Asian slice is drawn smaller than the 0.06% Hawaiian slice and much smaller than the 0.16% Am-Indian slice.  Let’s see what Excel does with the same numbers:

[Figure: Excel pie chart of the same numbers]

OK.  The graph is wrong but, unlike the “success” lie, there’s no benefit to it.  It just demonstrates incompetence in graphing.
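If you would rather not trust Excel, the same check takes a few lines of Python (matplotlib assumed).  Any honest pie routine sizes each wedge by its share of the plotted total, so the 1.33% wedge must come out the largest of the three:

```python
import matplotlib.pyplot as plt

# The three smallest slices from the RPS demographics chart.
labels = ["Asian", "Hawaiian", "Am-Indian"]
shares = [1.33, 0.06, 0.16]

# matplotlib sizes each wedge by its share of the plotted total, so the
# 1.33 wedge necessarily is drawn largest -- unlike the RPS graph.
plt.pie(shares, labels=labels)
plt.show()
```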

Then we have arithmetic:

[Image: the RPS chart’s school counts]

Thing is, if you add up the numbers, it’s 43 schools, not 44.

According to the Chart, RPS is spending more than a third of a billion dollars a year.  For that price, in one chart on their Web site they manage to tell a blatant lie, proffer a nonsense graph, and demonstrate that they have not mastered simple arithmetic.

Your tax dollars at “work.”

Passing the Kid (and Passing the Problem On)

Table 7 in the 2018 Superintendent’s Annual Report gives us, among other things, a count of the public school students retained in grade in 2017.  The short answer: A few in the first two grades and then hardly any until high school.

[Figure: students retained in grade, by grade, 2017]

Expressed as the percentage of students, the graph looks much the same:

[Figure: percentage of students retained, by grade]
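The percentage calculation is nothing fancy; here is a sketch, assuming a hand-built extract of Table 7 (the file and column names are mine, not the report’s):

```python
import pandas as pd

# Hypothetical extract of Table 7: one row per grade, statewide.
t7 = pd.read_csv("table7_retentions.csv")  # columns: Grade, Retained, Membership

# Students held back as a percentage of grade membership.
t7["PctRetained"] = 100 * t7["Retained"] / t7["Membership"]
print(t7)
```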

This greasing of the academic skids until the student reaches high school, but only until then, surely is related to the “ninth grade bump” that looks to be a nationwide phenomenon.

[Figure: the nationwide ninth grade enrollment bump]

The principal argument for social promotion seems to be that holding the kid back is even less effective for learning than promoting that student.  The literature I’ve seen does not explain why that reason (or simple inertia or moving the problem on to the next level or whatever else might explain social promotion) stops at the ninth grade.

Accreditation Euphemism

The School Quality Profiles on the VDOE Web site offer an interesting look under the hood of the new accreditation process.

There’s no central source of the numbers, so one must pluck them out, subject by subject and school by school.  The English data for Richmond took a chunk of an afternoon.

Here are those English data.  First, the elementary schools:

[Figure: Richmond elementary schools, English pass rates with adjustments]

The red line at 75% marks the nominal (“adjusted”) accreditation benchmark, “Level 1” under the regulation.  The purple marks the new, good-enough-for-four-years “Level 2” at 66%.  The various “adjustments” [subsection F.1] are color coded atop the blue pass rates.

As you see, only four schools made Level 1 directly; with the adjustments, add another eight.  Four schools were adjusted into Level 2; eight others did not make Level 2, even with the help of the adjustments. 

There are other adjustments, however.  For example, Miles Jones made Level 1 on the three year average; Bellevue and Fairfield Court made Level 2 on that average. 
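To keep the levels straight, here is my shorthand for the benchmark logic.  The 75% and 66% cutoffs are from the regulation; the function itself and its example numbers are a sketch of mine, not VDOE’s actual procedure:

```python
def english_level(adjusted_rate, three_year_avg=None):
    """Level 1: 75% or better; Level 2: 66% or better; otherwise Level 3.
    The three-year average can substitute for the current adjusted rate."""
    best = max(adjusted_rate, three_year_avg if three_year_avg is not None else 0)
    if best >= 75:
        return 1
    if best >= 66:
        return 2
    return 3

# The Miles Jones pattern (illustrative numbers): a current rate short of
# 75 but carried to Level 1 by the three year average.
print(english_level(70.0, three_year_avg=76.0))  # -> 1
```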

(The Greene “Growth” and “English Language” boosts are remarkably large.  FWIW, the growth boost, while extravagant, looks to be in line with other low-performing schools.  The database tells us that 220 of 303 Greene students taking the reading tests, 70%, were English Learners; those EL students passed at a 28.18% rate while 65.59% of the non-EL passed.  And, boost or no, Greene was accredited on the three year average.)
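A quick check of the arithmetic behind those Greene numbers:

```python
# Greene reading testers, from the VDOE database numbers quoted above.
el, el_rate = 220, 28.18                # English Learners
non_el, non_el_rate = 303 - 220, 65.59  # everybody else

# Weighted overall pass rate, before any accreditation "adjustments."
overall = (el * el_rate + non_el * non_el_rate) / 303
print(f"{overall:.1f}%")  # about 38.4% -- far below either benchmark
```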

After everything, the elementary school accreditation data look like this (with math, science, and absenteeism shown along with English but the “achievement” gaps and “Accredited on 2015 Standards” items omitted):

[Figure: Richmond elementary school accreditation data: English, math, science, absenteeism]

Miles Jones is anomalous there.  It looks fine on the items in this list but still is “with Conditions.”  Among the items not in the figure, Jones hit L3 on both math and English for its disabled students as well as math for black students and the math “achievement gap.”

Next the middle schools.

[Figures: Richmond middle schools, English pass rates with adjustments and accreditation data]

Hill looks fine on this figure but was L3 on the achievement gap and performance of disadvantaged and disabled students in both English and math, as well as the math performance of its black students.

The VDOE spreadsheet shows Brown at L3 in English; the School Quality Profiles show L2.  I can’t reconcile the two.  In any case, there are some L3’s in the achievement gaps that would explain the “with Conditions” at Brown.

The high schools look good, at least until you look past the reading scores.

[Figures: Richmond high schools, English pass rates with adjustments and accreditation data]

And the specialty schools look good, period.

[Figures: Richmond specialty schools, English pass rates and accreditation data]

Here are the 2017 and 2018 Richmond lists (omitting Amelia Street and Richmond Career).

[Figure: 2017 and 2018 Richmond accreditation lists]

Or, in summary (leaving Carver out):

[Figure: accreditation summary]

The old accreditation process was riddled with “adjustments” that improved the scores and allowed more schools to be accredited.  Even so, one Virginia school in twenty was denied accreditation in 2017.  The new regulation solves that problem.  In addition to providing a spectrum of helpful “adjustments” to boost the scores, the regulation provides:

If a school is [in such bad shape that even the adjustments don’t save it] and the school or school division fails to adopt and implement school division or school corrective action plans with fidelity as specified by 8VAC20-131-400 D, it may be designated by the board as “Accreditation Denied” as provided in 8VAC20-131-400 D 4. [Emphasis supplied.]

In short, as long as a school makes nice to the Board of Education, it won’t be denied accreditation.

Looking at the data above, the conclusion is clear: “Accreditation Denied” has gone away; “Accredited with Conditions” is the new, more Superintendent-friendly euphemism that means roughly the same thing. 

Thus, if we read the new system for what it means, not what it says, it might even be taken as an improvement on the old, bogus system.  Let’s call it the new, bogus system.

But, then, “Board of Education” is itself a euphemism, if not an outright lie (for example, see this and this and this and this) so there’s no reason to expect honest reporting from their “accreditation” process. 

Even more to the point, that Board has demonstrated (and members have admitted [Sept. 21, 2016 video starting at 1:48]) that they don’t know how to fix awful schools such as those in Petersburg.  So the labels they apply to failing schools provide bureaucratic camouflage: Those schools, and our Board of “Education,” can put on a happy face while they persist in failing to educate far too many children.

Why You Might Want a Woman for Your Kid’s English Teacher

. . . or for your lawyer if she went to school in Virginia, but not if she went to school in Richmond:

[Figure: SOL pass rates by subject and gender, Virginia and Richmond]

On the other hand, it might be OK for your next Doc to be a male (but, again, not if he went to school in Richmond).

The End of Course pass rates also suggest a preference for the female lawyer or English teacher and a slight preference for the female doctor, if she didn’t go to school in Richmond.

[Figure: End of Course pass rates by gender]

You may remember that the large Richmond population of economically disadvantaged students imposes about a 5% penalty on the Richmond SOL pass rates.  Unfortunately, adding that much to these Richmond numbers does not affect the recommendations.

How to “Correct” Lousy Test Scores

An earlier post  discusses the pass rate gap between Economically Disadvantaged students and their more affluent peers and how the revised SOL tests have increased that gap.  While the draft of that post was marinating overnight on the hard drive, my old, slow brain finally connected the dots: The real problem with those tests was not pass rates; it was accreditation.  The lower pass rates under the new, tougher tests were spawning a plethora of schools that were denied accreditation.

[Figure: schools denied accreditation after the new tests]

The three-year rolling average “adjustment” to the accreditation scores delayed the effect of the revised tests, but by 2016 one school in twenty was “denied” and some older cities were accreditation disaster areas.  The Board of “Education” was failing in its prime mission: Looking good in the glow of schools that were doing well.

The Board has demonstrated, and Board members have admitted (Sept. 21, 2016 video starting at 1:48), that they don’t know how to fix those awful schools.  To solve this problem, then, the Board chose to (further) corrupt the accreditation process.

Under the new, byzantine accreditation regulation, the process only starts with a pass rate; that rate then gets “adjusted.”  Among the more flagrant adjustments (sketched in code after this list):

  • Students in “recovery,” i.e., who flunked last year but pass this year after taking the remediation program, count as two passes;
  • Students who fail but show “growth” count as if they had passed (there is no deduction for students who show shrinkage); and
  • English learners who show “progress” count as passes.
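Here is the promised sketch of how those adjustments inflate a raw rate.  This is my reading of the regulation rendered as Python shorthand, not VDOE’s actual computation, and it ignores any overlap among the categories; the example school is hypothetical:

```python
def adjusted_pass_rate(passed, recovered, growth, el_progress, tested):
    """Adjusted accreditation rate, as I read the regulation.

    passed      -- outright passes (includes the recovery students' passes)
    recovered   -- failed last year, remediated, passed this year
                   (each earns a second credit on top of the pass)
    growth      -- failers who showed "growth" (credited as passes)
    el_progress -- English learners showing "progress" (credited as passes)
    """
    credits = passed + recovered + growth + el_progress
    return 100 * credits / tested

# Hypothetical school: a 60% raw rate becomes an 83% "adjusted" rate.
print(adjusted_pass_rate(passed=60, recovered=10, growth=8,
                         el_progress=5, tested=100))  # -> 83.0
```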

If that were not enough, there is a new, reduced accreditation benchmark, aka “Level 2.”

For example, in Petersburg this year the English adjustments added fifteen points at Lakemont and twelve at Pleasants Lane.

[Figure: Petersburg English pass rates with adjustments]

But, as the Petersburg data show, even that is not enough for some schools.  So, under the new regulation, even a school that fails miserably after all the adjustments cannot be denied accreditation unless it “fails to adopt and implement . . . corrective action plans with fidelity.”  And, even then, the school only “may” be denied accreditation. 

Translated from the bureaucratese:  A school will be accredited, no matter how terribly it performs, so long as it does not tell the Board of “Education” to go eat one off a twig.

Problem solved!  No matter the SOL numbers, every principal (except at Carver, where they got caught cheating) can now say: “Our school is accredited.”  And the Board can claim credit for that educational triumph.

Your tax dollars at “work.”

Money and Economic Disadvantage, Refrain

An earlier post examined the performance gap between Economically Disadvantaged (“ED”) students and their more affluent peers (“Not ED”).  The estimable Jim Bacon commented on the post (TY, Kind Sir!).  The likewise estimable Steve Haner (a former colleague who is a Fine Fellow, although sometimes nearly as cranky as yours truly) posted a comment to the Bacon post, pointing out that the performance gap might well support a demand for more school funding.

We have some data on that.

To start, here is the 2017 division average Gap, ED pass rate minus Not ED, plotted vs. the division day school disbursement per student (the most recent disbursement data are from 2017).

[Figure: ED minus Not ED pass rate gap vs. disbursement per student, 2017]

The slope of the fitted line (ca. -0.2% per thousand dollars) suggests that the gap widens slightly with increasing day school expenditure, but the tiny R-squared value tells us the fit explains almost nothing: the two variables are essentially uncorrelated.
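For the curious, the slope and R-squared come from an ordinary least-squares fit; here is a sketch with scipy.  The arrays are placeholder values of mine, not the VDOE data:

```python
import numpy as np
from scipy import stats

# One entry per division: disbursement per student (in $ thousands) and
# the ED-minus-Not-ED gap (percentage points).  Placeholder values only.
spending = np.array([9.5, 10.2, 11.0, 12.4, 15.9, 19.5])
gap = np.array([-18.0, -12.5, -20.1, -14.8, -19.0, -13.5])

fit = stats.linregress(spending, gap)
print(f"slope     = {fit.slope:.3f} points per $1,000")
print(f"R-squared = {fit.rvalue ** 2:.3f}")
# With the real data, the figure above showed a slope of ca. -0.2 and an
# R-squared near zero.
```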

Richmond is the yellow square point.  The red diamonds, from the left, are the peer cities Hampton, Newport News, and Norfolk.  Charles City is green; Lynchburg, blue.

The anomalous datum at $15,916 and +1.79 (ED students outperforming the Not ED) is Sussex.  The nearly-as-anomalous ($19,468, -0.66) is Bath.

The bottom line: Some divisions do better; some do worse; the amount of money doesn’t correlate with the performance difference.

The math data tell much the same story.

[Figure: math gap vs. disbursement per student]

None of the other subjects suggests otherwise.

[Figure: writing gap vs. disbursement per student]

(Notice the several divisions where the ED students outperform on the writing tests.)

[Figures: the remaining subjects’ gaps vs. disbursement per student]

At the division level, these data say that, on average, the Not ED students considerably outperform the ED students on the SOL tests and that more money per student does not relieve that difference.

Why Do the New Tests Punish the Poorer Kids?

The Board of Education revised the History & Social Science SOL tests in 2011, the Math tests in 2012, and the Science and English tests in 2013.

The pass rates dropped statewide upon introduction of the new tests, and then recovered.  Some.

[Figure: statewide SOL pass rates by subject and year]

Most recently, all the pass rates look to be sliding.  If, as we are told, the test difficulty has remained constant, this seems to spell Bad News for Virginia’s public school system.

Even more problematic, however, is the discriminatory effect of the new tests: Economically Disadvantaged (“ED”) students considerably underperformed their more affluent (“Not ED”) peers under the old tests; the new tests exacerbate the difference. 

Let’s look at the two most important tests, reading and math.  Here are the state average pass rates for ED and Not ED students for the period of the VDOE public database.

[Figure: state average reading and math pass rates, ED and Not ED]

To emphasize the performance gaps, we can look at the pass rate differences, ED minus Not ED.

[Figure: pass rate differences, ED minus Not ED, reading and math]

Immediately before the new tests, the ED/Not ED difference on the math tests was around twelve to thirteen points; it now is sinking toward twenty.  On English, ca. thirteen or fourteen points before and now approaching twenty-two.

There are at least three possible explanations:

  • Our schools had a hard time with ED students under the old tests; those schools are even less effective under the new, tougher tests; or
  • The ED students are less capable test takers than the Not ED, whether by genetics and/or environment, and that deficit is magnified by the greater rigor of the new tests; or
  • The new tests discriminate against ED students.

Take your pick of any one or combination.  These data won’t separate those possibilities.

But there is one obvious conclusion: Before the new tests, ED students did not perform as well as their more affluent peers.  The new tests magnify that deficit.  The Board of Education has found a way to penalize our less affluent students.

As well, this system punishes the divisions with larger populations of ED students.

For instance, Richmond:  On the reading tests in 2018, the pass rate of Richmond’s ED students was 14.68 points below the state average for ED students; Richmond’s Not ED, 13.73 points below.  Any rational scheme for evaluating Richmond would put it just over fourteen points low.  Yet, because Richmond’s pass rate averages are dominated by its 61% ED population, the division pass rate was 20.15 points below the state average.
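The mix effect is easy to verify with a little arithmetic.  In this sketch the two subgroup gaps and Richmond’s 61% ED share come from the paragraph above; the state pass rates and the statewide ED share are placeholders of mine:

```python
# Placeholder state figures; the gaps and Richmond's ED share are real.
state_ed, state_not = 60.0, 80.0   # hypothetical state ED / Not ED pass rates
state_ed_share = 0.40              # hypothetical statewide ED share

rva_ed = state_ed - 14.68          # Richmond ED: 14.68 points low
rva_not = state_not - 13.73        # Richmond Not ED: 13.73 points low
rva_ed_share = 0.61                # Richmond's ED share

state_avg = state_ed_share * state_ed + (1 - state_ed_share) * state_not
rva_avg = rva_ed_share * rva_ed + (1 - rva_ed_share) * rva_not
print(f"{state_avg - rva_avg:.2f}")  # 18.51 here: more than either subgroup gap
```

The larger ED share alone drags the division average well below either subgroup’s gap; with the actual rates, the deficit is the 20.15 points quoted above.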

Here are those data as a graph:

[Figure: 2018 reading pass rates, Richmond vs. state]

The math data tell the same story: Richmond suffers about a 5% penalty because of its larger ED population.

[Figure: 2018 math pass rates, Richmond vs. state]

(Of course, none of that excuses Richmond’s appalling failure to educate both its ED and its Not ED students.)

Our Board of “Education” had a measure of learning, the SGP (Student Growth Percentile), that was insensitive to students’ economic condition.  The Board abandoned that system.  Their excuse was that the results are not available until all the data are in, near the end of the summer.  Yet, they knew that when they adopted the SGP.

Whatever the real motive, the Board abandoned the evenhanded SGP in favor of the present scheme that punishes both the ED students and the school systems with larger populations of ED students.

Your tax dollars at “work.”

More Money for What?

The RT-D reports a march to the Capitol “demanding more funding for public education.”  The story dwells on recent cuts in funding for schools but mentions only one specific problem to be remedied, “a giant rat in the middle of the gym floor.”

The notion of buying better schooling is easy to understand but there is precious little evidence that it works in practice.  To the contrary, we have seen that the SOL pass rates in Virginia’s school divisions are quite unrelated to the divisions’ expenditures. 

In the other direction, WashPost reports a study that found an effect of increased expenditures, suggesting that benefits were most obvious for students from poor families. 

If larger expenditures indeed benefit less affluent students, the Virginia data should give us a yardstick: VDOE posts data for students who receive free and reduced price lunches.  And the SOL pass rates provide a measure of what counts – output – not what’s easy to count – inputs. 

To start, here are the 2017 division average reading pass rates for students who are economically disadvantaged (“ED”) or are not (“Not ED”), plotted vs. the division expenditure per student for the day school operation (we have to go back a year because the 2018 expenditure data are not yet up on the VDOE Web site).

[Figure: 2017 reading pass rates, ED and Not ED, vs. expenditure per student]

No correlation.  Zip.  Nada.  Whatever those expensive divisions are getting for their money, it’s not performance (by this measure), either of the ED or the Not ED students.

How about math?

[Figure: math pass rates vs. expenditure per student]

The math data might suggest that performance actually decreases with increasing expenditures, but the R-squared values are much too small to draw any conclusions.

Of course, much of the day school spending goes to administration, health services, transportation, and O&M, things that may or may not be related to students’ performance.  So let’s turn to salaries: Table 19 in the Superintendent’s Annual Report gives us the division average instructional salary.

The reading data give mixed results but, again, not enough correlation to draw any conclusions.

[Figure: reading pass rates vs. average instructional salary]

The math data sing the same song.

[Figure: math pass rates vs. average instructional salary]

In sum: At least as to across-the-board salary averages, the divisions that pay more – up to over twice as much – don’t get better reading or math SOL pass rates.

Yet the Governor says he wants to raise teacher pay.  He does not tell us how that money might translate into better learning.  Of course, he is a politician and he may have something other than learning in mind.

Class size is another popular consideration (again, however, a measure of inputs, not of learning).  Table 17 gives us the number of instructional positions per thousand students; I’ve recalculated to number per hundred. 

Starting again with reading:

[Figure: reading pass rates vs. instructional positions per hundred students]

The range of positions per hundred students there covers a ratio of 2.75.  The performance does not correlate.

How about math?

[Figure: math pass rates vs. instructional positions per hundred students]

Same story.

It is clear that, in terms of division averages, the per student expenditure, teacher salary, and number of teachers per student have no power to predict average SOL pass rates, either of ED or Not ED students.  In light of that, it would be helpful for the marchers and other folks who want to spend more on schools to explain exactly how the additional funds they advocate would improve student performance and, for sure, how that effect would be measured.

—————————

Note added on 12/12:

The estimable Jim Bacon points out that the large NoVa divisions might bias the data because of the high cost of living up there.

Fair enough.  Let’s repeat the analysis with those Big Guys (Alexandria, Arlington, Fairfax, Loudoun, Prince William, Spotsylvania, and Stafford) removed:

[Figures: the same six plots, with the NoVa divisions removed]

You might notice that the Big Guys disappear at the high salary end but mostly are down in the pack as to expenditure per student and positions per hundred.  In any case, the data tell the same story without the NoVa divisions as with them.