How About Those Salaries?

As a further look at the effect of expenditures on performance, here are the 2015 division average reading pass rates v. the 2015 division average budgeted teacher salaries (the actual 2015 salary data won’t be available until around the first of the year).  Data for Accomack are missing for lack of a report to VDOE.

[Graph: 2015 division average reading pass rates v. average budgeted teacher salaries]

Richmond is the gold square; the blue circle is the statewide division average.

The fitted curve suggests that an additional $10,000 in average salary is associated with a 2% increase in the pass rate, but the R² tells us that the two variables are essentially uncorrelated.
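For anyone who wants to check that kind of number, here is a minimal sketch of the fit, assuming the salary and pass-rate pairs are already in hand; the values below are made up for illustration, not the actual VDOE data:

```python
# Minimal sketch: fit a line to (salary, pass rate) pairs and report the
# slope per $10,000 and the R-squared.  All values are hypothetical.
from scipy.stats import linregress

salaries = [48_000, 52_000, 55_000, 61_000, 67_000]  # division average salaries ($)
pass_rates = [78.0, 81.5, 76.0, 80.0, 82.5]          # division pass rates (%)

fit = linregress(salaries, pass_rates)
print(f"slope: {fit.slope * 10_000:+.1f} points per $10,000")
print(f"R-squared: {fit.rvalue ** 2:.3f}")
```

If the R² comes back near zero, as in the graph above, the fitted slope explains almost none of the division-to-division variation.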

The math data paint a similar picture.

[Graph: 2015 division average math pass rates v. average budgeted teacher salaries]

Of course, we know that increasing economic disadvantage of the student population is associated with lower pass rates.  We can account for that average effect by using the correlation between pass rate and economic disadvantage to normalize the pass rates, i.e., express each division's pass rate as a percentage of the economic disadvantage trendline rate.  That produces these graphs (a sketch of the arithmetic follows them):

[Graph: normalized reading pass rates v. average budgeted teacher salaries]

[Graph: normalized math pass rates v. average budgeted teacher salaries]
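As promised, here is the normalization arithmetic as a minimal sketch.  The trendline inputs and the sample division are hypothetical stand-ins, not the actual VDOE data:

```python
# Minimal sketch of the normalization: express each division's pass rate
# as a percentage of the rate predicted by the economic-disadvantage
# trendline.  All numbers here are hypothetical.
from scipy.stats import linregress

ed_pct = [20.0, 35.0, 50.0, 65.0, 80.0]      # % economically disadvantaged
pass_rates = [88.0, 83.0, 79.0, 74.0, 70.0]  # division pass rates (%)

trend = linregress(ed_pct, pass_rates)

def normalized_rate(ed, rate):
    """Pass rate as a percentage of the trendline's predicted rate."""
    predicted = trend.intercept + trend.slope * ed
    return 100.0 * rate / predicted

# A hypothetical division at 60% ED with a 73% pass rate:
print(f"{normalized_rate(60.0, 73.0):.1f}% of the predicted rate")
```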

Again, only minuscule correlations.  And the fitted curves, to the extent they mean anything, say “no benefit from the higher salaries.”

So it seems that the divisions that pay their teachers more do not get better SOL performance; they merely pay more for the performance they get.

Finally, here for two of my faithful readers (maybe the only two) are the last two graphs showing the results for Charles City (purple circle) and Lynchburg (red circle).

[Graph: reading results with Charles City (purple circle) and Lynchburg (red circle) marked]

[Graph: math results with Charles City (purple circle) and Lynchburg (red circle) marked]

Data are posted here.

JLARC Punts

The Joint Legislative Audit and Review Commission (JLARC) has just published its draft report “Efficiency and Effectiveness of K-12 Spending.”  Unfortunately, that report does not even look carefully at where Virginia is spending its educational dollars, much less answer the (much harder) question of what we are getting for that money.

The Mandates

The General Assembly gave JLARC two decently clear mandates.

SJR328 (2013): JLARC shall

study the efficiency and effectiveness of elementary and secondary school spending in Virginia.  [It] shall (i) study the efficiency and effectiveness of elementary and secondary school spending in Virginia, including evaluating the findings from School Efficiency Reviews and assessing the extent to which recommendations have been implemented; (ii) compare to other states how and to what extent Virginia funds elementary and secondary education; and (iii) identify opportunities to improve the quality of education students receive in consideration of the funds spent.

2014 Appropriation Act, Item 30 (at p. 62 of the link): JLARC to examine virtual instruction, to include “effectiveness of virtual schooling in terms of student academic achievement outcomes on assessment tests and course completion or graduation rates.”

The “Study”

The result is a 112-page draft that ignores the instructions of the General Assembly.

Of the nine recommendations, six talk about efficiency; half of those six deal with school buses; only one deals with something that relates to education.  None tells us about the educational effectiveness of our school spending or how to improve it:

  1. Track teacher turnover.
  2. Provide facilities management expertise.
  3. Provide “guidance” regarding sharing information about facilities management best practices.
  4. Consider statewide contract for bus routing and monitoring software.
  5. Provide transportation management expertise.
  6. Assist with transportation management best practices.

As to virtual schooling, JLARC again avoids answering the question.  The three recommendations:

  1. Provide information about online schools.
  2. Estimate costs of online learning.
  3. Compare achievement of virtual v. physical schools.

That last one is particularly rich: JLARC is recommending that VDOE do what the General Assembly told JLARC to do.

Cranky’s Conclusion

This study is a wordy waste of money.  It does not answer the questions posed by the General Assembly.  Instead, it raises a new question: Why are we paying JLARC to not do what it’s been told to do?

A Reader’s Conclusion (added on 9/17)

A reader suggests an alternate (and more pertinent) conclusion: Why are we paying JLARC not to do what it’s been told to do, when we already are paying VDOE, which should be doing [what JLARC failed to do]?

New (Federal) College Data

USDOE has just posted a considerable trove of college data.

CAVEAT:  These data are mostly for students who received federal financial aid. 

  • “Average Annual Cost”: The average annual net price for federal financial aid recipients, after aid from the school, state, or federal government. For public schools, this is only the average cost for in-state students.
  • “Graduation Rate”: The graduation rate after six years for schools that award predominantly four-year degrees and after four years for all other schools. These rates are only for full-time students enrolled for the first time.
  • “Salary After Attending”: The median earnings of former students who received federal financial aid, at 10 years after entering the school.

My quick reading of the data does not disclose what fraction(s) of the student populations are represented here.

With that warning, here is a look at the Virginia public and not-for-profit colleges.  First the graduation rates:

[Graph: graduation rates, Virginia public and not-for-profit colleges]

The winners there are UVa in red, W&L in yellow, and W&M in green.

Next, the median salary ten years out:

[Graph: median salary ten years after entry, Virginia public and not-for-profit colleges]

W&L, in yellow, is the big winner here.

Finally, a bang/buck calculation, (Salary * Graduation Rate) / Average Cost:

[Graph: bang/buck calculation, Virginia public and not-for-profit colleges]

Colors, as before, are UVa in red, W&L in yellow.
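To make the arithmetic explicit, here is that calculation as a tiny function; the sample school and its numbers are hypothetical:

```python
def bang_per_buck(median_salary, graduation_rate, average_cost):
    """(Salary * Graduation Rate) / Average Cost, with the rate as a fraction."""
    return median_salary * graduation_rate / average_cost

# Hypothetical school: $55,000 median salary, 90% graduation rate,
# $15,000 average annual net cost.
print(f"{bang_per_buck(55_000, 0.90, 15_000):.2f}")  # 3.30
```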

Here is the dataset, sorted by school name.

[Table: the dataset, sorted by school name]

You might be interested in comparing these data with the results of the Brookings “value-added” study.

Anatomy of a Lousy Performance: SOLs by School

Here are the 2014 and 2015 Reading pass rates by Richmond school:

[Charts: 2014 and 2015 reading pass rates by Richmond school (elementary, middle, and high school lists)]

You may recall that the accreditation benchmark for reading is a 75% pass rate.  VDOE cooks the accreditation numbers so thoroughly that the 75% criterion may be interesting as a rule of thumb, but it is meaningless as a guide to which schools actually get accredited.  You’ll notice that none of the mainstream middle schools and far too few of the elementary schools made 75% this year.  Indeed, King went from unspeakably bad to worse, never mind anything to do with 75%.

For another, perhaps more useful, measure, the statewide average reading pass rate was 79.0%.

Franklin has both middle and high school grades, so I’ve included it in both lists, although its scores can’t be directly compared to either.

Carver continued its spectacular performance, leading the (only) six elementary schools that beat 75%.

Next, the math data.  Recall that the accreditation criterion is 70%.  The state average pass rate this year was 79.4%.

[Charts: 2014 and 2015 math pass rates by Richmond school (elementary, middle, and high school lists)]

Notice the decreases at Open and Marshall, as well as the uniformly miserable pass rates of the middle schools.  Note the several elementary schools doing pretty well, led again by Carver. 

SOL v. Cost

Table 13 in the Superintendent’s Annual Report lists annual disbursements by division.  Unfortunately, we have only the 2014 data; the current data ordinarily don’t come out until the following spring.

Deleting the facilities, debt, and contingency entries, and juxtaposing the resulting disbursement totals with the 2015 Reading SOL Pass rates, produces the following graph.

[Graph: 2015 division reading pass rates v. 2014 disbursements per student]

Richmond is the gold square.  The red diamonds are, from the left, Hampton, Newport News, and Norfolk.  Thus we see the comparable old, urban jurisdictions performing poorly at about average cost, while Richmond’s reading performance is much worse at a much higher cost.

The datum up there at $11,127, 23% less expensive than Richmond, is West Point, with an 87.8% pass rate.

The R² value of 2.3% tells us that, among the Virginia school divisions, reading performance and cost per student are essentially uncorrelated.
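For anyone who wants to reproduce this juxtaposition, here is a minimal pandas sketch of the preparation described above.  The file names, column labels, and category names are my assumptions, not the actual layout of the VDOE tables:

```python
# Minimal sketch: drop the facilities, debt, and contingency entries from
# the Table 13 disbursements, total by division, and join with the 2015
# reading pass rates.  File names and column labels are assumptions.
import pandas as pd

spend = pd.read_csv("table13_disbursements_2014.csv")  # hypothetical extract
sol = pd.read_csv("reading_pass_rates_2015.csv")       # hypothetical extract

excluded = ["Facilities", "Debt Service", "Contingency Reserve"]
spend = spend[~spend["category"].isin(excluded)]

per_division = spend.groupby("division", as_index=False)["disbursement"].sum()
merged = per_division.merge(sol, on="division")
print(merged.head())  # columns: division, disbursement, pass_rate
```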

The math data paint a similar picture.

[Graph: 2015 division math pass rates v. 2014 disbursements per student]

The division pass rates again fail to correlate with expenditure. 

The point up top ($11,127, 89.0%) is West Point, again.

These data say, quite clearly, that Richmond’s education establishment should stop whining about money and start educating the City’s children.

New SOL Data, Continued . . .

The excuse we often hear for Richmond’s poor performance on the SOL tests is poverty.

VDOE has data on that.  They define a student as “economically disadvantaged” if that student “1) is eligible for Free/Reduced Meals, or 2) receives TANF, or 3) is eligible for Medicaid, or 4) [is] identified as either Migrant or experiencing Homelessness.”  Data are here.
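To make that definition concrete, here it is as a tiny predicate; the parameter names are mine, not VDOE’s field names:

```python
def economically_disadvantaged(free_reduced_meals: bool, tanf: bool,
                               medicaid: bool, migrant_or_homeless: bool) -> bool:
    """VDOE's rule: any one of the four conditions makes a student ED."""
    return free_reduced_meals or tanf or medicaid or migrant_or_homeless
```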

Juxtaposing the 2015 Division pass rates with the ED percentage of the enrollment, we see the following for the reading tests:

[Graph: 2015 division reading pass rates v. percent of students economically disadvantaged]

With an R² of 0.5, it appears that ED is a reasonably good predictor of Division reading pass rates.

Richmond is the gold square on the graph.  The red diamonds are the comparable old, urban jurisdictions: From the left, Hampton, Newport News, and Norfolk.  The yellow points are the outstanding performers: From the left, West Point, Wise, Norton, and Highland.  Notice that Norton and Highland are outperforming about as much as Richmond is underperforming, with about the same level of poverty.

Turning to the math tests, the correlation drops but the pattern is much the same:

[Graph: 2015 division math pass rates v. percent of students economically disadvantaged]

Richmond again is the gold square; the red points again are Hampton, Newport News, and Norfolk.  Norton drops out of the yellow outperforming group, leaving West Point, Wise, and Highland.

Looks to me like Richmond needs a better excuse than poverty.