New (Federal) College Data

USDOE has just posted a considerable trove of college data.

CAVEAT:  These data are mostly for students who received federal financial aid. 

  • “Average Annual Cost”: The average annual net price for federal financial aid recipients, after aid from the school, state, or federal government. For public schools, this is only the average cost for in-state students.
  • “Graduation Rate”: The graduation rate after six years for schools that award predominantly four-year degrees and after four years for all other schools. These rates are only for full-time students enrolled for the first time.
  • “Salary After Attending”: The median earnings of former students who received federal financial aid, at 10 years after entering the school.

My quick reading of the data does not disclose what fraction(s) of the student populations are represented here. 

With that warning, here is a look at the Virginia public and not-for-profit colleges.  First the graduation rates:

image

The winners there are UVa in red, W&L in yellow, and W&M in green.

Next, the median salary ten years out:

image

W&L, in yellow, is the big winner here.

Finally, a bang/buck calculation, ((Salary * Graduation Rate) / Average Cost):

image

Colors, as before, are UVa in red, W&L in yellow.
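
For the curious, the arithmetic behind that index is simple. Here is a minimal sketch in Python (the school figures are invented for illustration, not taken from the dataset):

```python
# Bang/buck index used above: (salary * graduation rate) / average annual cost.
# The figures below are invented for illustration, not actual Scorecard data.

def bang_per_buck(median_salary, grad_rate, avg_annual_cost):
    """Salary, discounted by the odds of graduating, per dollar of annual cost."""
    return median_salary * grad_rate / avg_annual_cost

# Example: $50,000 median salary, 90% graduation rate, $20,000 average cost.
print(bang_per_buck(50_000, 0.90, 20_000))  # 2.25
```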

Here is the dataset, sorted by school name.

image

You might be interested in comparing these data with the results of the Brookings “value-added” study.

No SATisfaction

VDOE has posted the 2015 Virginia average SAT scores.  As of today (9/11/15), RPS has not. 

While I was looking for scores, I found a list of SAT scores for State universities in Virginia.  The site does not date those scores.  Given that the scores do not change a lot from year to year, I thought it might be interesting to juxtapose the university data with the 2014 Richmond scores.

Here, then, are the 25th and 75th percentile reading scores of students entering our public universities, along with the state and Richmond averages for college-bound students as reported by RPS:

image

Notice that this is an apples-and-oranges comparison: it sets averages against percentiles.  That said, the state average for college-bound students is close to the 25th percentile scores of entering students at Mason, VMI, and Christopher Newport.  The Richmond average is fifty points below the 25th percentile at Longwood.

And here are the math scores:

image

Requiem for the VGLA

I have written at length about Richmond’s abuse of its students with disabilities in order to improve SOL scores.  The 2015 results help complete a post mortem on that outrage, so here is one last set of data.

Starting in 2005, VDOE allowed the divisions to offer a home-brewed (and, most importantly, locally graded) alternative test, the VGLA, for students who could perform at grade level but whose disability interfered with taking the written SOL tests.

You might reasonably think that only a few kids in any division would need that accommodation.  In fact, the use of the VGLA mushroomed.  One teacher quoted her director’s explanation: “My dog could pass VGLA.”

In 2009, the Superintendent in Buchanan County admitted he had “encouraged the use of VGLA as a mechanism to assist schools in obtaining accreditation and in meeting AYP targets.”  Instead of firing that Superintendent for flagrant cheating and child abuse, VDOE merely required him to write a “Corrective Action Plan.”

Indeed, despite having a computer full of data showing abuse of the VGLA in Richmond and elsewhere, VDOE remained deliberately ignorant of the problem until the introduction of HB304 in the 2010 General Assembly, a bill requiring a specific justification for every child taking the VGLA.  At that point, the State Superintendent became “concerned” and VDOE promulgated new math tests (2012) and reading tests (2013) that eliminated the VGLA except for some ESL students.

The new tests were tough; they reduced pass rates in most divisions. 

image

The disappearance of the VGLA also had a dramatic effect on the pass rates of students with disabilities; the effect in Richmond was more dramatic still.  The graph below plots ratios of pass rates: the red curve is the Richmond pass rate for students with disabilities divided by the state average for students with disabilities; the blue curve is the same ratio for students without disabilities.

image

Here we see Richmond’s students with disabilities outperforming their peers statewide(!), while Richmond’s students without disabilities underperformed.  Then came the new reading tests in 2013 and Richmond’s students with disabilities had the locally-graded VGLA snatched away and replaced by the same test everybody else was taking.  The Richmond students with disabilities suddenly underperformed their peers statewide by even more than Richmond’s students without disabilities were underperforming.
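
To be concrete about what those curves measure, here is a minimal sketch of the calculation (the pass rates are invented for illustration; they are not actual SOL numbers):

```python
# Each curve above is (Richmond pass rate) / (state average pass rate), by year.
# The pass rates below are invented for illustration; they are not actual SOL data.
richmond_swd = {2012: 55, 2013: 25}  # Richmond, students with disabilities
state_swd = {2012: 42, 2013: 38}     # state average, students with disabilities

ratios = {year: richmond_swd[year] / state_swd[year] for year in richmond_swd}
for year, ratio in sorted(ratios.items()):
    print(year, round(ratio, 2))  # above 1.0 means outperforming the state
```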

The math scores show the same effect.

image

The important, and shameful, outcome here is that no division Superintendent and nobody at VDOE went to jail or was even fired.  After the General Assembly put up a big Stop sign, VDOE merely changed the system.  And the bureaucrats in Buchanan and Richmond and, doubtless, elsewhere who were violating the public trust were left to think up new ways to abuse the children in their care.

And the kids who were harmed by this cynical and disgraceful episode were left to fend for themselves.

Your tax dollars at work.

Richmond Schools by Grade

The wonderful VDOE database can give us pass rates by test subject by grade.  For a single school, here Westover Hills, the data look something like this:

image

image

And here is Lucille Brown:

image

image

But there’s no way to fit all the data on one graph, or even on a few.  So I have posted the spreadsheet.

After you open the spreadsheet, select the reading or math tab and follow the instructions there.  Please let me know if you have a problem: john {at} crankytaxpayer [dot] org.

The spreadsheet is here.

High Schools

Finally, here are the End of Course pass rates at Richmond’s high schools for the reading and math tests:

image

image

Notice the huge decrease in scores caused by the new reading tests in ‘13 and the appalling drop attending the new math tests in ‘12.

These data must be taken with a front-end loader full of salt because of Richmond’s remarkably high retest rate, which VDOE kept hidden until a judge made it cough up the 2014 SGP data.  Here, for example, are the 2014 Algebra I retest counts in Richmond.

https://calaf.org/wp-content/uploads/2015/03/image59.png

My analysis of those data showed that the retests increased the SOL scores of the students involved by an average of 24.6 points on a test where 400 was passing (i.e., by 6.1% of the passing score).
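
The 6.1% figure is just the average gain divided by the passing score:

```python
# Average retest gain as a fraction of the 400-point passing score.
print(f"{24.6 / 400:.2%}")  # 6.15%
```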

This year, for the first time, retests were available for elementary and middle school students.  Even with that boost (about 4% according to VDOE), the Richmond middle school scores, and far too many elementary school scores, remained mired at appalling levels.

Richmond Middle Schools

Here are the average reading pass rates, by year, of the Richmond middle schools.

image

Recalling that the Accreditation Benchmark for reading is 75%, we see our highest scoring middle school five points below that mark.

Recalling further that Thompson was the only Richmond school denied accreditation last year, and noticing that Thompson almost looks good in comparison to King and Henderson, we see further evidence of VDOE’s wholesale cooking of the accreditation data.

The math data paint a similar picture.

image

The Accreditation Benchmark here is 70% and only Hill comes even close.  Note also that the “accreditation denied” Thompson has whipped the “accredited with warning” Elkhardt, Henderson, and King for all four years of the new math tests.

It should be a crime to subject any child to any of these schools.  It is an embarrassment.

Richmond Elementary Schools

Looking further into the SOL pass rates, here are the reading data for Richmond’s elementary schools by year:

image

That graph is much too complicated, but it does contain all the data, plus the Richmond average.  It also illuminates the disastrous effect of the new tests in 2013, the relative abilities of some schools to weather the change (notably Carver(!), Munford, and Fox, and, with a different pattern, Fairfield), and the gross failures of others (notably Mason, Oak Grove, and Woodville).  Doubtless this tells us a lot about the principals of those schools, especially in light of the more challenging student populations at Carver and Fairfield.

Focusing on the high and low performers, plus our neighborhood school (with the State average for grades 3-5 added), gives a more readable picture.

image

The math scores paint a similar picture, except that the Big Hit from the new tests came a year earlier.

image

image

VCU: Expanding Upon Incompetence

The other day, Jim Bacon published a piece on the Brookings study, Beyond College Rankings: A Value-Added Approach to Assessing Two- and Four-Year Schools.

Brookings looked at “economic value added.”  In essence, they measured “the difference between actual alumni outcomes (like salaries) and predicted outcomes for institutions with similar characteristics and students.”
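
In code terms, the idea is just a residual; here is a schematic sketch (illustrative numbers, not Brookings’s actual regression):

```python
# Brookings-style "value added": actual alumni outcome minus the outcome
# predicted for similar institutions and students. Numbers are illustrative.
def value_added(actual_salary, predicted_salary):
    return actual_salary - predicted_salary

# Alumni earning $48,000 against a $42,000 prediction "add" $6,000;
# alumni earning $40,000 against the same prediction come up $2,000 short.
print(value_added(48_000, 42_000))  # 6000
print(value_added(40_000, 42_000))  # -2000
```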

The Brookings data have something to tell us about the jobs our colleges and universities are doing.

Here, for a start, are the Brookings data for Virginia institutions.

image

If we graph those data, we see a clear trend.  Indeed, it makes sense that mid-career salary would correlate with occupational earnings.

There are two clearly anomalous data points: the University of Richmond (the yellow square) and Mary Washington (the pink circle) are well removed from the trend.

image

Richmond is the only “Baccalaureate College[]” in the list, although it offers advanced degrees in law, education, and other subjects.  Presumably Brookings looked only at the undergraduate program.  As to why Richmond graduates would have relatively high mid-career salaries but low earnings, go figure.

If we just look at the research universities, we see a clearer picture: outstanding performance, except from VCU, and a strong trend.

image

image

Unfortunately, VCU seems devoted to its mediocre performance.  They are spending $90,000 per year of your and my tax money to hire Richmond’s failed Superintendent as an Associate Professor in “Educational Leadership.”  If that kind of “leadership” is characteristic of VCU, it’s a miracle that their graduates’ earnings are not still lower.

Turning to the schools with masters programs:

image

image

Here we see another fairly clear trend, with Mary Washington and Hampton showing anomalous mid-career salary outcomes.

My takeaway: Enjoy VCU basketball and send your kids to another university.

————————————————————

Note added Wednesday afternoon: The inimitable Carol Wolf pointed out that VSU [not VUU; oops!] beat VCU as to both earnings and mid-career salary.

Indeed.  VSU in green, VCU in red:

image

In fact, it’s more dramatic than that.  VCU is a research university, while VSU is in the lower-scoring cohort of “Master’s Colleges and Universities.”  It would have been entirely unremarkable if VCU had significantly outscored VSU.  It is entirely remarkable that VSU whipped VCU in both rankings.

Reversible “Progress”

The ratty, old steel guardrails on Riverside Drive did a fair job of standing the tests of time and drunks.  The new wooden guardrail got (and failed) its first test about 02:00 Friday morning on the curve just west of the 42d St. parking lot.

The folks who installed the railing did a nice job of setting it straight; our Friday morning visitor undid that straightness,

150419 004

mostly by moving the posts.

150419 005

150419 006

If our City repairs the new guardrail with the same care that they bring to picking up the leaves on Riverside Drive, we are in for a deteriorating wooden eyesore.

150419 001

Your tax dollars at “work.”

“Educator” = “Criminal”??

The Wall Street Journal this morning headlined “Eleven Atlanta Educators Convicted in Cheating Scandal.”

The story reported that eleven of twelve former administrators, principals, and teachers were convicted of racketeering for their participation in a conspiracy to cheat on their students’ standardized tests.

Looks like the WSJ couldn’t think of a better word than “educator.”  They might have said “former public school personnel.”  “Criminal” would have been even more compact.

For sure, calling those people “educators” was a slam to all the decent folks who work in the public schools.