In the Accreditation Basement

VDOE has posted the 2016 Accreditation Ratings, based on the 2015 test scores.

I’ll have more to say later about VDOE’s manipulation of the Accreditation Ratings, to include the newly minted “junior is flunking but by less than before” ratings.  For now, here are the Richmond results.



It’s hard to know what all those “TBD” entries mean.  I’ll have a look at the pass rates and post them here soon.  For sure, 38% fully accredited is not good news.

And, also for sure, Thompson was denied accreditation last year but the newly-minted Elkhardt-Thompson is getting a free pass.

Preliminary Graduation Data

VDOE is out today with a press release bragging on the increased On-Time graduation rate.

The VDOE Web site was down today until suppertime; I will not have time to analyze the data until tomorrow.  Until then, here are some early data (“Actual” rate refers to the advanced+standard diploma rate; “on-time” includes counts of those diplomas plus the modified standard, special, and general achievement diplomas):
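To make the two definitions concrete, here is a minimal sketch of the arithmetic; the cohort size and diploma counts below are invented placeholders, not the actual VDOE figures.

```python
# Hypothetical cohort and diploma counts, for illustration only.
cohort = 1000
advanced, standard = 300, 450
modified_standard, special, general_achievement = 60, 40, 20

# "Actual" rate: advanced + standard diplomas only.
actual_rate = 100.0 * (advanced + standard) / cohort

# "On-time" rate: those diplomas plus the modified standard,
# special, and general achievement diplomas.
on_time_rate = 100.0 * (advanced + standard + modified_standard
                        + special + general_achievement) / cohort

print(f"Actual: {actual_rate:.1f}%")    # -> Actual: 75.0%
print(f"On-time: {on_time_rate:.1f}%")  # -> On-time: 87.0%
```

As the placeholder numbers show, the on-time rate always runs at least as high as the actual rate, since it counts every diploma the actual rate counts plus three more kinds.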



How About Those Salaries?

As a further look at the effect of expenditures on performance, here are the 2015 division average reading pass rates v. the 2015 division average budgeted teacher salaries (the actual 2015 salary data won’t be available until around the first of the year).  Data for Accomack are missing for lack of a report to VDOE.


Richmond is the gold square; the blue circle is the statewide division average.

The fitted curve suggests that an additional $10,000 average salary is associated with a 2% increase in the pass rate, but the R² tells us that the two variables are essentially uncorrelated.
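For readers who want to check such a fit themselves, here is a minimal sketch of the least-squares slope and R² calculation; the salary and pass-rate numbers are invented placeholders, not the VDOE data.

```python
def linear_fit(x, y):
    """Ordinary least-squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    slope = sxy / sxx
    return slope, my - slope * mx

def r_squared(x, y, slope, intercept):
    """Fraction of the variance in y explained by the fitted line."""
    mean_y = sum(y) / len(y)
    ss_res = sum((b - (slope * a + intercept)) ** 2 for a, b in zip(x, y))
    ss_tot = sum((b - mean_y) ** 2 for b in y)
    return 1 - ss_res / ss_tot

# Placeholder data: division average salary vs. reading pass rate.
salary = [45000, 48000, 52000, 55000, 60000, 65000]
passes = [72.0, 75.0, 70.0, 78.0, 74.0, 77.0]

slope, intercept = linear_fit(salary, passes)
print(f"pass-rate change per $10,000: {slope * 10000:.2f} points")
print(f"R^2: {r_squared(salary, passes, slope, intercept):.3f}")
```

An R² near zero says the line explains almost none of the division-to-division variation, whatever its slope happens to be.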

The math data paint a similar picture.


Of course, we know that increasing economic disadvantage of the student population is associated with lower pass rates.  We can account for the average effect by using the correlation between pass rate and economic disadvantage to normalize the pass rates, i.e., express the pass rates as percentages of the economic disadvantage trendline rates.  That produces these graphs:
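The normalization described above can be sketched in a few lines: fit a pass-rate vs. economic-disadvantage trendline, then express each division's actual pass rate as a percentage of the rate the trendline predicts for its disadvantage level.  The numbers below are invented placeholders, not the actual VDOE figures.

```python
def linear_fit(x, y):
    """Ordinary least-squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    slope = sxy / sxx
    return slope, my - slope * mx

# Placeholder data: % economically disadvantaged vs. % passing.
disadvantage = [20.0, 35.0, 50.0, 65.0, 80.0]
pass_rate = [85.0, 80.0, 72.0, 68.0, 60.0]

slope, intercept = linear_fit(disadvantage, pass_rate)

# Normalized rate: actual pass rate as a percentage of the
# trendline rate predicted for that level of disadvantage.
normalized = [100.0 * p / (slope * d + intercept)
              for d, p in zip(disadvantage, pass_rate)]
print([round(n, 1) for n in normalized])
```

A normalized rate above 100 means the division beat the trendline for its level of disadvantage; below 100 means it fell short.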



Again, only minuscule correlations.  And the fitted curves, to the extent they mean anything, say “no benefit from the higher salaries.”

So it seems that the divisions that pay their teachers more do not get better SOL performance; they merely pay more for the performance they get.

Finally, here for two of my faithful readers (maybe the only two) are the last two graphs showing the results for Charles City (purple circle) and Lynchburg (red circle).



Data are posted here.


The Joint Legislative Audit and Review Commission (JLARC) has just published its draft report “Efficiency and Effectiveness of K-12 Spending.”  Unfortunately, that report does not even look carefully at where Virginia is spending its educational dollars, much less answer the (much harder) question of what we are getting for that money.

The Mandates

The General Assembly gave JLARC two decently clear mandates.

SJR328 (2013): JLARC shall

study the efficiency and effectiveness of elementary and secondary school spending in Virginia.  [It] shall (i) study the efficiency and effectiveness of elementary and secondary school spending in Virginia, including evaluating the findings from School Efficiency Reviews and assessing the extent to which recommendations have been implemented; (ii) compare to other states how and to what extent Virginia funds elementary and secondary education; and (iii) identify opportunities to improve the quality of education students receive in consideration of the funds spent.

2014 Appropriation Act, Item 30 (at p. 62 of the link): JLARC to examine virtual instruction, to include “effectiveness of virtual schooling in terms of student academic achievement outcomes on assessment tests and course completion or graduation rates.”

The “Study”

The result is a 112-page draft that ignores the instructions of the General Assembly.

Of the nine recommendations, six talk about efficiency; half of the six deal with school buses; only one of the six deals with something that relates to education.  None tells us about the educational effectiveness of our school spending or how to improve it:

  1. Track teacher turnover.
  2. Provide facilities management expertise.
  3. Provide “guidance” regarding sharing information about facilities management best practices.
  4. Consider statewide contract for bus routing and monitoring software.
  5. Provide transportation management expertise.
  6. Assist with transportation management best practices.

As to virtual schooling, JLARC again avoids answering the question.  The three recommendations:

  1. Provide information about online schools.
  2. Estimate costs of online learning.
  3. Compare achievement of virtual v. physical schools.

That last one is particularly rich: JLARC is recommending that VDOE do what the General Assembly told JLARC to do.

Cranky’s Conclusion

This study is a wordy waste of money.  It does not answer the questions posed by the General Assembly.  Instead, it raises a new question: Why are we paying JLARC to not do what it’s been told to do?

A Reader’s Conclusion (added on 9/17)

A reader suggests an alternate (and more pertinent) conclusion: Why are we paying JLARC not to do what it’s been told to do, when we are already paying VDOE, which should be doing [what JLARC failed to do]?

New (Federal) College Data

USDOE has just posted a considerable trove of college data.

CAVEAT:  These data are mostly for students who received federal financial aid. 

  • “Average Annual Cost”: The average annual net price for federal financial aid recipients, after aid from the school, state, or federal government. For public schools, this is only the average cost for in-state students.
  • “Graduation Rate”: The graduation rate after six years for schools that award predominantly four-year degrees and after four years for all other schools. These rates are only for full-time students enrolled for the first time.
  • “Salary After Attending”: The median earnings of former students who received federal financial aid, at 10 years after entering the school.

My quick reading of the data does not disclose what fraction(s) of the student populations are represented here.

With that warning, here is a look at the Virginia public and not-for-profit colleges.  First the graduation rates:


The winners there are UVa in red, W&L in yellow, and W&M in green.

Next, the median salary ten years out:


W&L, in yellow, is the big winner here.

Finally, a bang/buck calculation, (Salary * Graduation Rate) / Average Cost:
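The bang/buck formula is simple enough to show directly; the school figures in the example are hypothetical, not taken from the USDOE dataset.

```python
def bang_per_buck(median_salary, graduation_rate, average_cost):
    """Median salary weighted by graduation rate, per dollar of
    average annual cost."""
    return median_salary * graduation_rate / average_cost

# Hypothetical school: $60,000 median salary ten years out,
# 90% graduation rate, $15,000 average annual cost.
print(bang_per_buck(60000, 0.90, 15000))  # -> 3.6
```

Higher is better: a school earns a high score by combining strong salaries and graduation rates with a low net price.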


Colors, as before, are UVa in red, W&L in yellow.

Here is the dataset, sorted by school name.


You might be interested in comparing these data with the results of the Brookings “value-added” study.