SOL v. Cost

Table 13 in the Superintendent’s Annual Report lists annual disbursements by division.  Unfortunately, we only have the 2014 data; the current data ordinarily don’t come out until the following Spring.

Deleting the facilities, debt, and contingency entries, and juxtaposing the resulting disbursement totals with the 2015 Reading SOL Pass rates, produces the following graph.


Richmond is the gold square.  The red diamonds are, from the left, Hampton, Newport News, and Norfolk.  Thus we see the comparable old, urban jurisdictions performing poorly at about average cost while Richmond’s reading performance is much worse at a much higher cost.

The datum up there at $11,127, 23% less expensive than Richmond, is West Point, with an 87.8% pass rate.

The R² value of 2.3% tells us that, among the Virginia school divisions, reading performance and cost per student are essentially uncorrelated.
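For readers who want to check this kind of number themselves, here is a minimal sketch of how an R² for a simple linear fit is computed. The division figures below are invented placeholders, not the actual Table 13 or SOL data.

```python
# Hypothetical (made-up) division data: cost per student vs. reading pass rate.
costs = [9500, 10200, 11127, 11800, 12500, 14450, 10800, 9900]
pass_rates = [78.1, 81.4, 87.8, 74.2, 79.9, 67.0, 82.3, 76.5]

def r_squared(x, y):
    """Coefficient of determination for a simple linear fit of y on x."""
    n = len(x)
    mean_x = sum(x) / n
    mean_y = sum(y) / n
    sxx = sum((xi - mean_x) ** 2 for xi in x)
    sxy = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
    syy = sum((yi - mean_y) ** 2 for yi in y)
    # For simple linear regression, R^2 is the squared correlation coefficient.
    return (sxy * sxy) / (sxx * syy)

print(f"R^2 = {r_squared(costs, pass_rates):.3f}")
```

An R² near zero, as with the real data here, says the fitted line explains almost none of the division-to-division variation.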

The math data paint a similar picture.


The division pass rates again fail to correlate with expenditure. 

The point up top ($11,127, 89.0%) is West Point, again.

These data say, quite clearly, that Richmond’s education establishment should stop whining about money and start educating the City’s children.

New SOL Data, Continued . . .

The excuse we often hear for Richmond’s poor performance on the SOL tests is poverty.

VDOE has data on that.  They define a student as “economically disadvantaged” if that student “1) is eligible for Free/Reduced Meals, or 2) receives TANF, or 3) is eligible for Medicaid, or 4) [is] identified as either Migrant or experiencing Homelessness.”  Data are here.

Juxtaposing the 2015 Division pass rates with the ED percentage of the enrollment, we see the following for the reading tests:


With an R² of 0.5, it appears that ED is a reasonably good predictor of Division reading pass rates.
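The over- and underperformance discussed below is just the residual from that fit: actual pass rate minus the pass rate the ED percentage predicts. A minimal sketch, using invented numbers rather than the VDOE data:

```python
# Illustrative only: fit pass rate against percent economically disadvantaged
# and compute each division's residual (actual minus predicted).
ed_pct    = [20.0, 35.0, 45.0, 60.0, 75.0, 55.0]
pass_rate = [88.0, 82.0, 79.0, 71.0, 64.0, 62.0]

n = len(ed_pct)
mean_x = sum(ed_pct) / n
mean_y = sum(pass_rate) / n
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(ed_pct, pass_rate)) / \
        sum((x - mean_x) ** 2 for x in ed_pct)
intercept = mean_y - slope * mean_x

def predicted(ed):
    return intercept + slope * ed

# A division outperforms if its residual is positive, underperforms if negative.
residuals = [y - predicted(x) for x, y in zip(ed_pct, pass_rate)]
for ed, actual, resid in zip(ed_pct, pass_rate, residuals):
    print(f"ED {ed:4.1f}%  actual {actual:4.1f}  "
          f"predicted {predicted(ed):5.1f}  residual {resid:+5.1f}")
```

On this measure, Norton and Highland sit well above the line while Richmond sits well below it, at about the same ED level.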

Richmond is the gold square on the graph.  The red diamonds are the comparable old, urban jurisdictions: From the left, Hampton, Newport News, and Norfolk.  The yellow points are the outstanding performers: From the left, West Point, Wise, Norton, and Highland.  Notice that Norton and Highland are outperforming about as much as Richmond is underperforming, with about the same level of poverty.

Turning to the math tests, the correlation drops but the pattern is much the same:


Richmond again is the gold square; the red points again are Hampton, Newport News, and Norfolk.  Norton drops out of the yellow outperforming group, leaving West Point, Wise, and Highland.

Looks to me like Richmond needs a better excuse than poverty.

SOL Scores are Up. Richmond Scores are Up Some.

VDOE today posted the 2015 SOL data.

“Wait a minute,” you say.  “They had those data in time to schedule graduations last May.  Why did they wait until now to publish them?”

Good question.  Their excuse is that the summer testing data are not available until now.  The reason (one suspects; they hide the data so we can’t actually know) is that they manipulate these data extensively and all that futzing takes time.

Quick Background

The new math tests in 2012 and the new reading tests in 2013 lowered the pass rates statewide and clobbered the Richmond rates.  The Richmond recovery was delayed because our former Superintendent failed to align the curricula to the new tests.  This provided our new Super a nice opportunity to shine. 

First Look at the Data

A first look at the 2015 data suggests that it was more of a glowing than a shining:


Compared to some peers and a neighbor:



Stay tuned for more analysis.

Car Breakins in Forest Hill

An email yesterday from The Association told us about a Jahnke Road, Forest Hill meeting responding to some robberies over there.


The email also mentioned an increase in car breakins in our neighborhood.  I went to the Police Dept site looking for some specifics.

The data as of July 27 show nice overall decreases in total offense reports and in the most frequent of those, car breakins.



Notice the summertime spikes in both total reports and in car breakins.  I think the two are related in that the car breakin opportunities chum the neighborhood for lowlifes whom we really don’t want to have hanging around our homes.

For the period of the database, we see the average number of reports dropping from almost fifteen a month to about four, with the car breakins decreasing from about six to about one.
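The tallying behind those monthly averages is simple. Here is a sketch using invented incident records in place of the actual RPD database export:

```python
# Placeholder data only -- not actual RPD reports.
from collections import Counter
from datetime import date

reports = [
    (date(2014, 6, 3), "theft from vehicle"),
    (date(2014, 6, 17), "vandalism"),
    (date(2014, 7, 2), "theft from vehicle"),
    (date(2014, 7, 21), "theft from vehicle"),
    (date(2014, 8, 9), "burglary"),
]

def monthly_counts(records, offense=None):
    """Count reports per (year, month), optionally filtered to one offense."""
    counts = Counter()
    for d, kind in records:
        if offense is None or kind == offense:
            counts[(d.year, d.month)] += 1
    return counts

totals = monthly_counts(reports)
breakins = monthly_counts(reports, "theft from vehicle")
avg_total = sum(totals.values()) / len(totals)
print(f"average reports/month: {avg_total:.1f}")
print(f"car breakins by month: {dict(breakins)}")
```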

There’s been a small spike in both measures this summer.

The 4200 block of Riverside Drive maintains its “lead,” primarily because of car breakins at the Park.


Here are the top blocks for total reports and then for car breakins (the percentages are block/neighborhood; the time period is January, 2000 to July 26, 2015):



The numbers have decreased nicely since the City blocked the back row of parking spaces in the 42d St. lot (the ones not visible from the road) and started locking the lot on weekdays and at night.  As well, RPD has been helpful in watching both the 42d St. and 41st St. (Hillcrest) lots.



Turning to the top blocks:

4200 Block Riverside Drive:

Car breakins and the attendant property destruction account for almost half of the reports in this block.


4400 Block Forest Hill Ave.:

The major source of disorder here is the nursing home, rather than the Park.


4100 Block Riverside Drive

Back to the Park and car breakins.



4700 and 4800 Forest Hill Ave.

These commercial blocks show an entirely different (and more violent) pattern, as we might expect.  These numbers are low, however, because they include reports only from the FHNA area, i.e., the north side of Forest Hill Ave.

Well, the City Planning people think those blocks are in Westover Hills.


But the Police database puts them in Forest Hill.  Either way, we shop there.

First the 4700 block:


And then the 4800 block:


Sermon Begins

We live in a quiet neighborhood and it’s been getting quieter.  But we’re just over halfway through 2015 and we already have met our car breakin quota for the year.


Car breakins are perhaps the ONLY crime that is completely preventable.  As RPD says on the flyers they put under windshields on Riverside Dr. this Spring: “Put your junk in your trunk.”

Our neighbors have mostly learned this lesson.  The kids who go to the Park on warm, sunny afternoons have not.  It’s time for some signs like those at Maymont:


The small sign on the backside of the kiosk at the 42d St. lot is too little too late: Those kids need to be warned before they leave the car.

People to talk to about this:

VCU: Expanding Upon Incompetence

The other day, Jim Bacon published a piece on the Brookings study, Beyond College Rankings: A Value-Added Approach to Assessing Two- and Four-Year Schools.

Brookings looked at “economic value added.”  In essence, they measured “the difference between actual alumni outcomes (like salaries) and predicted outcomes for institutions with similar characteristics and students.”
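In other words, "value added" is just actual outcome minus modeled outcome. A toy illustration, with invented schools and dollar figures rather than Brookings' numbers:

```python
# school: (actual mid-career salary, salary predicted from a model of
# institutions with similar characteristics and students). Invented data.
outcomes = {
    "School A": (61000, 55000),
    "School B": (52000, 56000),
    "School C": (58000, 57500),
}

value_added = {school: actual - predicted
               for school, (actual, predicted) in outcomes.items()}

# Rank schools by how far actual outcomes beat (or trail) predictions.
for school, va in sorted(value_added.items(), key=lambda kv: -kv[1]):
    print(f"{school}: value added ${va:+,}")
```

A school can post respectable raw salaries and still show negative value added, if institutions with similar students do better.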

The Brookings data have something to tell us about the jobs our colleges and universities are doing.

Here, for a start, are the Brookings data for Virginia institutions.


If we graph those data, we see a clear trend.  Indeed, it makes sense that mid-career salary would correlate with occupational earnings.

There are two clearly anomalous data points: the University of Richmond (the yellow square) and Mary Washington (the pink circle), both well removed from the trend.


Richmond is the only “Baccalaureate College[]” in the list, although it offers advanced degrees in law, education, and other subjects.  Presumably Brookings looked only at the undergraduate program.  As to why Richmond graduates would have relatively high mid-career salaries but low earnings, go figure.

If we just look at the research universities, we see a clearer picture: Outstanding performance, except from VCU, and a clear trend.



Unfortunately, VCU seems devoted to its mediocre performance.  They are spending $90,000 per year of your and my tax money to hire Richmond’s failed Superintendent as an Associate Professor in “Educational Leadership.”  If that kind of “leadership” is characteristic of VCU, it’s a miracle that their graduates’ earnings are not still lower.

Turning to the schools with masters programs:



Here we see another fairly clear trend, with Mary Washington and Hampton showing anomalous mid-career salary outcomes.

My takeaway: Enjoy VCU basketball and send your kids to another university.


Note added Wednesday afternoon: The inimitable Carol Wolf pointed out that VSU [not VUU; oops!] beat VCU as to both earnings and mid-career salary.

Indeed.  VSU in green, VCU in red:


In fact, it’s more dramatic than that.  VCU is a research university, while VSU is in the lower-scoring cohort of “Master’s Colleges and Universities.”  It would have been entirely unremarkable if VCU had significantly outscored VSU.  It is entirely remarkable that VSU whipped VCU in both rankings.

Reversible “Progress”

The ratty, old steel guardrails on Riverside Drive did a fair job of standing the tests of time and drunks.  The new, wood guardrail got (and failed) its first test about 02:00 Friday morning on the curve just west of the 42d St. parking lot.

The folks who installed the railing did a nice job of setting it straight; our Friday morning visitor undid that straightness,


mostly by moving the posts.



If our City repairs the new guardrail with the same care that they bring to picking up the leaves on Riverside Drive, we are in for a deteriorating wooden eyesore.


Your tax dollars at “work.”

“Educator” = “Criminal”??

The Wall Street Journal this morning headlined “Eleven Atlanta Educators Convicted in Cheating Scandal.”

The story reported that eleven of twelve former administrators, principals, and teachers were convicted of racketeering for their participation in a conspiracy to cheat on their students’ standardized tests.

Looks like the WSJ couldn’t think of a better word than “educator.”  They might have said “former public school personnel.”  “Criminal” would have been even more compact.

For sure, calling those people “educators” was a slam to all the decent folks who work in the public schools.

A Modest Proposal

SOL scores decrease with decreasing economic status of the family.  Thus, the Feds have required VDOE to compute a measure of learning, not family income (select the SGP Primer link).  VDOE selected the SGP.  VDOE now has three years of those SGP data that can be used to measure teacher effectiveness.
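The core of the SGP idea is to rank a student's current score only against academic peers who started from the same prior score, so family income washes out. VDOE's actual method uses quantile regression over several prior years; the sketch below, with invented scores, is a deliberate simplification of that idea.

```python
def growth_percentile(current_score, peer_scores):
    """Percent of academic peers (same prior score) this student outscored."""
    below = sum(1 for s in peer_scores if s < current_score)
    return round(100 * below / len(peer_scores))

# Invented peer group: current-year scores of students who had the same
# prior-year score as our student.
peers = [410, 425, 433, 440, 452, 460, 471, 480, 495, 510]
print(growth_percentile(452, peers))  # outscored 4 of 10 peers -> 40
```

A student from a poor family and a student from a rich family with the same starting score face the same yardstick, which is the point.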

VDOE has a lawyer full of bogus excuses for not releasing the data with the teachers’ identities attached.  None of those would prevent use of the data to rate the effectiveness of the colleges that sent us those teachers.

Just think, VDOE now can measure how well each college’s graduates perform as fledgling teachers and how quickly they improve (or not) in the job.  In this time of increasing college costs, those data would be important for anyone considering a career in education.  And the data should help our school divisions make hiring decisions.

In addition, VDOE could assess the effectiveness of the teacher training at VCU, which is spending $90,000 a year of your and my tax money to hire Richmond’s failed Superintendent as an Associate Professor in “Educational Leadership.”  Wouldn’t it be interesting to see whether that kind of “leadership” can produce capable teachers (although it produced an educational disaster in Richmond)?


Lynchburg SGP

My paternal grandmother was Angie Lynch, said to be a relative of John Lynch.  Angie was the second woman in the Oklahoma territory with an advanced degree.

I’ve maintained an affection for Lynchburg, especially in celebration of the US 460 bypass that makes travel to Roanoke a much lighter task.  So it was a particular sorrow when my earlier Lynchburg post got wiped.

In light of VDOE’s third data release (that includes data by teacher, but not by school), I thought I’d redo the post.

First, as a reminder, here are the statewide distributions of teacher average SGPs in reading and math.



Next the Lynchburg distributions.




Need I say it: These are not good numbers.

We have three years’ data so let’s look at the trends, restricting the graphs to those teachers who taught the subject for all three years.


There are too many reading teachers to make much sense of the graph (the table on the right is too small to even list them all).  Let’s take out all but the top and bottom few.


Here we see some average and below average teachers improving nicely (No. 66197 presents a happy picture) and others deteriorating severely (No. 69532 is an unfortunate counterbalance to No. 66197).  The citywide average by teacher (which includes all the teachers, even those who taught reading for only one or two years) is low, and the lack of a trend does not suggest improvement.

Going directly to the bowdlerized dataset, the math data are more lively.


Of interest, we again see low-performing teachers whose performance deteriorated.  We also see a citywide average that bounced but then dropped back to subpar.

Only three Lynchburg teachers taught Algebra I all three years so the graph is much simpler.


None of the three improved over the period; quite the contrary.  The average is pulled down by the teachers, not shown, who taught fewer than all three years.  It starts above the state average but deteriorates into the unacceptable range populated by Lynchburg’s reading and math averages.

We also have detailed data by teacher, although VDOE won’t tell us who they are.  The high-performing teacher in this collection is No. 71485, who had only one 4th grade math student scoring below the statewide average.


In contrast, the best math SGP in the 4th grade class of Teacher No. 71819 was 23.


This teacher also had a 4th grade reading class.


The 25.7 average in that reading class is far from acceptable but it is far less dismal than the 4.4 average in this teacher’s math class.

For any reader inclined to overlook the fundamental notion that the SGP measures teacher performance, a glance at the eight students who took both reading and math from this teacher is instructive.


One student scored in the same Student Growth Percentile in both subjects; the other seven scored higher, some much higher, in reading.  Note especially student No. 7B89048849408, who scored in the first percentile with the benefit of this teacher’s math instruction but in the 70th on the reading test.

Unfortunately, this teacher is getting worse.


I could go on but I think these data make my points.  I’ll suggest five things:

  • Lynchburg has a problem with its school system.
  • No. 71819 is an awful math teacher.
  • No. 71819 is a very bad reading teacher.
  • Any principal who subjected schoolchildren to No. 71819 in 2015 should be fired.
  • The bureaucrats at VDOE who refuse to identify No. 71819, as well as that teacher’s principal, to the parents of Lynchburg are misusing the public funds that pay them and pay for the statewide testing.