Problem in Paradise: Theft from Motor Vehicle in Forest Hill

The daffodils are budding in the alley and I still haven’t updated the Forest Hill crime report data.  Let’s get to work.

First the geography: In the Police Dept. database, the neighborhood runs from the park to the Boulevard and from Forest Hill Ave. to the river.

That area does not include all of the Forest Hill Neighborhood Ass’n territory and does include some of the Westover Hills Ass’n area. It takes in only one side of Forest Hill Ave. and, notably, only one side of the commercial area.

Microsoft has a nice view of the area.

For the period from the start of the Police Department Database, January 1, 2000, through December 31, 2018, that database contains 3,081 offense reports for the Forest Hill neighborhood. 

The database contains lots of duplicates.  In this case, 999 of the entries duplicate the incident number, offense code, and offense number of another incident, leaving 2,082 unique entries.

Among those 2,082 entries, “theft from motor vehicle” is the most common, at 27%.

[chart: Forest Hill offenses by type, 2000–2018]
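
For anyone who wants to reproduce those counts, here is a minimal pandas sketch; the file and column names are my inventions, not the Department’s actual schema:

```python
import pandas as pd

# Hypothetical CSV export of the Police Department database.
reports = pd.read_csv("forest_hill_offenses_2000_2018.csv")

# Drop entries that duplicate another entry's incident number, offense code,
# and offense number (3,081 raw entries, 999 duplicates, 2,082 unique).
unique = reports.drop_duplicates(
    subset=["incident_number", "offense_code", "offense_number"]
)

# Share of each offense type among the unique entries; "theft from motor
# vehicle" tops the list at about 27%.
print(unique["offense"].value_counts(normalize=True).head(10).round(3))
```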

I like to call those incidents “car break-ins,” but that is not accurate: Most of them are cases where park visitors left the car unlocked. The count of “Destruction property/private property” incidents gives a high but approximate measure of the actual break-ins. For the unlocked cars, “abandoned property in car” might be the more accurate label.

As usual in a quiet neighborhood, most of the incidents involve property crime. In the present case, the most frequent violent crime is simple assault, which sits in tenth place, behind categories that account for 69% of the total (ninth place, at 67%, if we don’t count the 56 natural deaths).

The neighborhood was enjoying a consistent pattern of improvement until 2015.

[graph: Forest Hill incidents per year]

The increases then were driven by increases in theft from motor vehicle.

[graph: thefts from motor vehicle per year]

By far our worst block for crime is the 4200 block of Riverside Drive.

[chart: crime counts by block]

That block is home to the 42d St. Parking Lot.

[image: the 42d St. Parking Lot]

Half of the crime reported in the block is theft from motor vehicle; second place goes to property destruction, i.e., the cases where the thief had to break in because the car was locked.

[chart: offenses in the 4200 block]

No telling how much of the rest is spillover from the criminals lured into our neighborhood by the unlocked cars.

The earlier decreases in the 4200 block came after Parks’ 2005 response to neighborhood complaints: They started locking the gates to the 42d St. lot at night and in the off-season, and they installed rocks to block parking in the part of the lot that is less visible from the street.

[image: the 42d St. lot]

I attribute the recent increases to the increased use of the Park, the removal of the rocks in 2016, and the reassignment of Stacy, the bicycle cop.

Aside from 4400 Forest Hill (mostly the nursing home) and 4700–4800 Forest Hill (the commercial area), the other blocks at the top of the list are high theft-from-motor-vehicle blocks:

[charts: thefts from motor vehicle in those blocks]

There are two lessons here:

  • Leaving stuff in the car, especially in an unlocked car, is an invitation to lose the stuff and to help chum the neighborhood for criminals; and
  • Given that most of the thefts are from the vehicles of park visitors, it’s past time for some LARGE signs in the 4200 block and at the Nature Center and 4100 Hillcrest and, especially, in the 42d St. parking lot, to warn the visitors:
        Car Breakins Here! Lock your junk in your trunk.

If you share my view on that, please contact our Councilwoman and fill in the Park Master Plan Questionnaire.  Here is what I said there:

[image: the author’s questionnaire response]

College Graduation Rates

The SCHEV Web site has cohort graduation rates for first-time, full-time freshmen in our four-year institutions.  Because the data include the six-year rate, the most recent report is for the cohort that entered in 2012-13.

[chart: 4-year and 6-year graduation rates by institution]

The numbers inside the green bars are the 4-year rates; those beyond the ends of the bars, the 6-year. You’re on your own for the five-year rates.

Notice JMU and University of Richkids approaching the W&M/UVa rates and notice especially W&L leading the pack.

If we calculate the ratio of the 4-year to 6-year graduation rates, we get:

[chart: ratio of 4-year to 6-year graduation rates]
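
The arithmetic behind that chart is a single division; a minimal sketch, assuming a CSV pulled from the SCHEV report with column names of my own invention:

```python
import pandas as pd

# Hypothetical extract of the SCHEV cohort report; file and column names
# are my inventions, not SCHEV's.
rates = pd.read_csv("schev_cohort_2012_13.csv")  # institution, grad_4yr, grad_6yr

# Ratio of the 4-year rate to the 6-year rate for each institution.
rates["ratio_4_to_6"] = rates["grad_4yr"] / rates["grad_6yr"]
print(rates.sort_values("ratio_4_to_6", ascending=False).head(10))
```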

The lower average ratio for the public schools reflects the relatively larger 5- and 6-year rates at the urban schools, the former teachers’ colleges, and Tech.

Comcast vs. the Neighborhood

Returning home from the gym yesterday, I found the alley blocked by a vehicle.

[photo: the vehicle blocking the alley]

Judging from the sign on the side of the vehicle, the driver is a Comcast contractor (without the required Virginia plates, it seems).

[photo: the sign on the vehicle]

Getting no response from a polite honk of my horn, I resorted to a more robust sounding.  That produced the driver, walking down the (empty) driveway of the house to the right and rear of this picture.

[photo: the alley and driveway]

I asked if he would move the car.  He said no.

I turned around and drove around the block to get to the other end of the alley, thinking unkind thoughts about Comcast.

This kind of behavior does not improve Comcast’s (already less than wonderful) reputation.

Richmond: More Money, Worse Schools: Update

An earlier post discussed our Superintendent’s false statement about financial support.  That post was based on the 2017 VDOE data, the latest funding data then available.  I’ve updated the post with the 2018 numbers that VDOE just posted.

Our school Superintendent wrote an op-ed for the Times-Dispatch complaining that:

Virginia’s highest poverty school divisions — which serve large percentages of children of color — receive 8.3 percent less in per-pupil funding than the state’s wealthiest districts. Put plainly: The students who should be getting more are actually getting less.

In fact, Virginia’s high-poverty divisions (those with larger numbers of economically disadvantaged students) actually spend more per pupil on average than the more affluent divisions.

[graph: division per-pupil expenditures vs. % economically disadvantaged]

Richmond, the gold square on the graph, spends $2,219 more than average per student; indeed, it is the fourteenth biggest spender (of 132).

Table 15 in the (State) Superintendent’s Annual Report permits a look into the sources of those funds that the Richmond schools are spending.   

The table breaks out division receipts by source:

  • State Sales and Use Tax (1-1/8 % of the sales tax receipts);
  • State Funds (appropriated by the Generous Assembly);
  • Federal Funds (direct federal grants plus federal funds distributed through state agencies); and
  • Local Funds (local appropriations).

Let’s start with a graph of the per student spending of state and federal funds vs. the division percentage of economically disadvantaged (“ED”) students:

[graph: state and federal per-student funds vs. % ED]

The immediate lesson here is that Superintendent Kamras is simply wrong about high-poverty schools being starved for outside funds: The sales tax funding is essentially flat vs. % ED, while the state appropriations and the federal funding increase with increasing % ED.

Richmond, the gold squares, is a bit low in terms of state funding, but that deficit is offset (and then some) by federal money.  Richmond’s sales tax funding, $1,071 per student, is hidden in the forest of other schools, almost exactly on the fitted line.

Only in local funding can we find any hint of support for the notion that divisions with more poor students receive less money.

[graph: local per-student funds vs. % ED]

Of course, it is no surprise that less affluent jurisdictions might provide fewer funds to their school systems.  For the most part, they have less money to spend on any governmental operation. 

Kamras’ own division, with a 61% ED population in its schools, nonetheless came up in 2018 with $1,806 in local funds per student more than that fitted line would predict. 

As well, when all the fund sources are added in, the spending on education increases with increasing populations of disadvantaged students: See the graph at the top of this post.

In summary, Richmond schools received LOTS of money in these categories:

[table: Richmond receipts by source, actual vs. predicted]

[“Predicted” values here are calculated from the Richmond % ED and the fitted lines in the graphs above.  The sum of the predicted values is seven dollars less than the value calculated from the actual total, which is probably explained by the inclusion of tuition from other divisions in the total reported.]
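
For the curious, that “predicted” arithmetic is just a point on each fitted line at Richmond’s % ED. A minimal sketch; the slope and intercept values below are placeholders, not the actual fit coefficients:

```python
# Each "predicted" value is the fitted line evaluated at Richmond's ED
# percentage.  The coefficients below are placeholders; the real ones come
# from the least squares fits in the graphs above.
richmond_pct_ed = 61.0  # Richmond's % ED, per the text

fits = {  # source: (intercept $, slope $ per 1% ED) -- placeholder numbers
    "sales tax": (1_000.0, 1.0),
    "state":     (2_000.0, 40.0),
    "federal":   (200.0, 15.0),
    "local":     (8_000.0, -60.0),
}

for source, (intercept, slope) in fits.items():
    predicted = intercept + slope * richmond_pct_ed
    print(f"{source:>9}: predicted ${predicted:,.0f} per student")
```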

So, when he says “The students who should be getting more are actually getting less,” our Superintendent is wrong.  And, even more to the point, Kamras’ own schools are enjoying much more than average financial support.

The Kamras op-ed is a bald attempt to excuse the ongoing failure of the Richmond public schools to educate Richmond’s schoolchildren.  For example, on the 2018 reading pass rates:

https://calaf.org/wp-content/uploads/2018/09/image-33.png

The excuse is contradicted by reality: Those Richmond schools are swimming in money.  Even more to the point, the performance of Virginia school divisions is unrelated to how much money they spend.  For example:

[graphs: division pass rates vs. per-student expenditures]

Richmond again is the gold square, spending lots of money and getting lousy results.  For comparison, the peer cities are the red diamonds: from the left, Hampton, Norfolk, and Newport News.  As a courtesy to my readers there, the violet diamond is Lynchburg, the green, Charles City.

It would be helpful for our Superintendent to turn his energy to improving the performance of our schools and to stop misleading the public about the reasons those schools are so ineffective.

If At First You Don’t Succeed, Change the Subject

On Sunday, The Times-Dispatch published an op-ed in which Superintendent Kamras decried “institutional racism” and suggested that the first step to deal with it would be more money for divisions with higher poverty rates.  Such as Richmond, of course.

Kamras wrote:

According to the National Center on Education Statistics, Virginia’s highest poverty school divisions — which serve large percentages of children of color — receive 8.3 percent less in per-pupil funding than the state’s wealthiest districts. Put plainly: The students who should be getting more are actually getting less.

The estimable Jim Bacon pulled some data from the VDOE Web site to question that proposition.  The full dataset makes an even stronger case than the one Bacon argued.

The Superintendent’s Annual Report includes at Table 13 a list of division disbursements.  The latest data there are from 2017.  The VDOE Web site also sports a (very nice) database that provides “membership” data by race and by “disadvantage” (primarily free and reduced lunch numbers), inter alia.

Excel is happy to juxtapose the datasets on a graph.  Let’s start with economic disadvantage (“ED”).

[graph: division per-student disbursements vs. % ED]

Note:  The disbursement data here are division totals, less spending for facilities, debt service, and contingency reserve.

The fitted line suggests that divisions with more ED students spend more per student ($244 for a 10% increase in ED enrollment) but the 2.3% R-squared value tells us the two variables are very weakly correlated.
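
For anyone reproducing that fit, a minimal numpy/pandas sketch; the file and column names are assumptions, not VDOE’s:

```python
import numpy as np
import pandas as pd

# Hypothetical juxtaposition of the Table 13 disbursements with the VDOE
# membership data; file and column names are my own.
df = pd.read_csv("division_spending_vs_ed.csv")  # pct_ed, spend_per_student

slope, intercept = np.polyfit(df["pct_ed"], df["spend_per_student"], 1)
r_squared = np.corrcoef(df["pct_ed"], df["spend_per_student"])[0, 1] ** 2

# Per the text: roughly +$244 per student for a 10-point rise in % ED,
# with an R-squared near 2.3% -- a real slope but a very weak correlation.
print(f"slope: ${slope * 10:,.0f} per student per 10-point increase in % ED")
print(f"R-squared: {r_squared:.1%}")
```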

In any case, there is no pattern here of deprivation of those divisions with large ED populations. Quite the contrary, most of the Big Spenders are high-poverty divisions.

As well, we see Richmond, the gold square, spending lots of money.  Indeed, Richmond is the tenth biggest spender among the 132 divisions.

[table: the ten biggest spenders]

We can argue about the reasons for the lousy performance of Richmond’s public schools, but lack of money is not a candidate.

Of course, Kamras was talking about race, not poverty.  There is no need here to accept his undocumented melding of those two factors; the VDOE database also has the division membership by race.

[graph: division per-student disbursements vs. % black enrollment]

Note: Bland and Highland are absent from this dataset, presumably because their black enrollments are small enough to trigger the suppression rules.

Again, the slope is in the wrong direction for the Kamras complaint ($275 increase per 10% increase in the black student population).  And this time the 7.7% R-squared value hints more strongly at a correlation.

The absence of Highland County from the list moves Richmond up to ninth place (of 130 divisions).

[table: the ten biggest spenders in this dataset]

Bottom line: The VDOE data contradict Kamras’ claim that divisions with more poor or more black students are under funded vis-à-vis the other divisions.

More fundamentally, Kamras’ jeremiad about funding overlooks the abundant data that show no relationship between division funding and SOL performance.  Money is not the problem in Richmond’s schools; lousy schools are the problem.

Looking at the study he cites, it appears that Kamras is complaining about the funds Virginia schools receive [Table 7] in an arbitrary grouping, not what each division spends.  Whatever that study may mean, it cannot contradict the Virginia data that show Richmond and other high-poverty (and high black student percentage) divisions spending about as much money as their more affluent peers.  And, for sure, those school systems can’t spend more than they receive.

Indeed, Kamras’ division is spending a lot of money and getting lousy results.  It would be helpful for our Superintendent to spend more energy improving the performance of his schools and less on misleading the public about the reason those schools are so awful.

Graduation Rates: Official Fiction

The estimable Carol Wolf sent me the link to an article reporting improving graduation rates of disabled students and asked whether that improvement is reflected in Virginia or Richmond.

We earlier saw that Virginia’s graduation rate has been increasing while the reading and math End of Course pass rates were falling. That is, the Board of “Education” has its thumb on the statistical scale. Per Carol’s inquiry, let’s take a closer look at the overall rates and delve into the rates for disabled students.

The current requirements for a “standard” diploma include six “verified” credits, two in English plus one each in math, a laboratory science, history & social science, and a student-selected subject.  To earn a verified credit, the student must both pass the course and pass the End of Course (“EOC”) SOL test “or a substitute assessment approved by the Board of Education.”

[Do you see the thumb there on the scale?]

To start, here are the reading EOC pass rates for the past five years.

[graph: reading EOC pass rates, state and Richmond, all students and disabled students]

Hmmm.  How might we explain those Richmond disabled numbers for 2014-16?  Friar Occam might suggest cheating.  In any case, this is not a picture of improving performance.

Then we have writing.

[graph: writing EOC pass rates]

History & Social Science.

[graph: history & social science EOC pass rates]

Math.

[graph: math EOC pass rates]

And science.

[graph: science EOC pass rates]

There are some bumps and squiggles there, but the trends are clear: The state averages are fading and Richmond’s are plunging.

The five-subject average smooths out the variations.

[graph: five-subject average EOC pass rates]

That’s clear enough: The statewide averages have declined in the last two years; despite some gains in ‘15 and ‘16, those averages have declined overall since 2014.  The Richmond averages have plummeted.

Turning to diplomas: Our educrats report (and brag upon) an “on-time” graduation rate. To get that rate, they define “graduates” to include students who receive any of the following diplomas: Advanced Studies, Standard, Modified Standard, Special, and General Achievement.

To their credit, the federales do not count the substandard diplomas: The federal rate includes only advanced and standard diplomas.  To combat that bit of realism, Virginia two years ago redefined the Modified Standard Diploma by allowing “credit accommodations” to transform it, in most cases, into a Standard Diploma.

This had a nice effect statewide and a dramatic effect in Richmond.

[graphs: the redefinition’s effect on diploma counts, statewide and in Richmond]

With that background, let’s look at the four-year cohort graduation rates.

[graph: four-year cohort graduation rates]

Those increases are enough to warm an educrat’s heart, at least until we notice that:

  • The pass rates don’t support the recent increases, and
  • That 2017 bump in the disabled rate (which boosts the overall rate in some measure) reflects 1,200 or more modified standard diplomas that were transformed into standard diplomas by fiat.

The redefinition gave Richmond a nice bump in 2017, but the overall rate resumed its decline in 2018.

[graph: Richmond graduation rates]

So, yes, Carol.  The Virginia four-year cohort graduation rates rose, both for disabled students and for all students.  The rise was enhanced after 2016 by (even further) manipulated data.  The rise continued at a time the pass rates in the End of Course SOL tests were declining.  If you believe those improving numbers, I want to sell you some shares in a nice James River bridge.

Richmond’s declining numbers remind us that even bogus statistics can’t make Richmond’s public schools look like anything but a menace to the students.

Your tax dollars at “work.”

Postscript: It looks like inflated graduation rates are a national phenomenon.

Boosting the Graduation Rate

As VDOE bragged, their (bogus) “On-Time” graduation rate rose this year.  They didn’t brag about the Richmond rate; it dropped.

Turns out, the (somewhat less bogus) “federal” rates show the same pattern.

[graph: federal graduation rates, state and Richmond]

Note: Post updated from October, 2018 to correct the 2017 state rate.

The increase in the state rate was driven by an increase in the standard diploma rate.  The drop in the Richmond rate came from a 2.7% decrease in the advanced studies rate.

[graph: graduation rates by diploma type]

But you, Astute Reader, are still looking at that first graph and asking: “John!  You’ve been ranting about how VDOE’s manipulation improved the state rate by about 1.3 points for ‘17 and ‘18 and the Richmond rate by perhaps five points.  Where are those increases?”

Ah, what a pleasure to have an attentive reader! The 1.3- and 5-point boosts look to have been offset, entirely or in part, by decreases in the End of Course pass rates.

Turning to the data, here are the graduation rates again along with the averages of the five EOC subject area pass rates.

[graph: graduation rates with five-subject EOC pass rate averages]

Students must pass six EOC tests to graduate. Thus, decreases in the pass rates on those required tests must have lowered the graduation rates. Then the VDOE data manipulation offset those graduation rate declines in some measure.

That looks like a general explanation. The specifics would require more detailed knowledge of which students passed or failed which courses, where in their four-year journey through high school they did so, and whether they graduated. For sure, the drop in the Richmond pass rates is consistent with the absence of the five-point boost there.

Of course, correlation is not causation and doubtless there are other factors in the mix here.  The floor is open for any more convincing suggestion.

BTW: The big drops in the Richmond EOC pass rates, and the lesser drops in the state rates, mostly came in math and science.

[graphs: math and science EOC pass rates, state and Richmond]

Preview of Coming Attraction: The Board of “Education” has its next Finagle factor in place: Under the new accreditation regulation (subsection B.3), we now have “locally verified credits” for students who flunk the required SOL tests. This should ensure another nice increase in the state graduation rate, paid for by another not-so-nice decrease in student learning.

Punishing Poverty

Our Board of “Education” has engineered a reporting system that rewards the more affluent divisions and penalizes the poorer ones.

We have seen that economically disadvantaged (“ED”) students underperform their more affluent peers (“Not ED”) in terms of pass rates on the SOL tests.  In 2018, that underperformance ranged from 17.43% on the History & Social Science tests to 21.82% on the writing tests.

[chart: ED vs. Not ED pass-rate gaps by subject, 2018]

This places schools and divisions with larger ED populations at an unfair disadvantage with respect to the SOL averages.

Here, to start, is a selection of reading pass rates.

[graph: reading pass rates for nine selected divisions]

I’ve selected these nine divisions because their ED populations are spaced at about 10% intervals.  (They are, from the left, Falls Church, Powhatan, Arlington, Gloucester, Bath, Dickenson, Roanoke City, Northampton, and Greensville.)  The orange diamonds are the division average reading pass rates of the Not ED students; the green triangles are the ED pass rates. 

The SOL average pass rates are the blue circles.  The divisions with smaller numbers of ED students to drag down that average have higher SOL numbers.  Indeed, the divisions with fewer than 51% ED (the statewide average) enjoy a boost; divisions with more than 51% suffer a penalty.

The extreme examples are Falls Church and Greensville County (which includes Emporia).

Falls Church had the lowest ED numbers in the state, 9%. The reading pass rate for their Not ED students was 95%; for ED, it was 67% (a 28-point spread; no bragging rights there). The average of the ED and Not ED averages was 81%, which would have been a fair measure of the performance of both groups. The division SOL pass rate, however, was 92% because of the small number of EDs. Falls Church thus enjoyed an eleven-point SOL bonus for affluence.

Greensville County was at the other end of the scale, 93% ED. The pass rates were lower: 76% for Not ED, 59% for ED, a seventeen-point spread with a 68% average. But the reported SOL was 61%. Greensville suffered a seven-point SOL penalty for poverty.
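
That bonus/penalty arithmetic is nothing more than a weighted average. A minimal sketch using the numbers above; the small differences from the reported SOLs reflect rounding and the actual enrollment weights:

```python
def division_sol(pct_ed, ed_rate, not_ed_rate):
    """Division pass rate as the enrollment-weighted average of the two groups."""
    w = pct_ed / 100.0
    return w * ed_rate + (1.0 - w) * not_ed_rate

# Falls Church: 9% ED; ED rate 67, Not ED rate 95 (numbers from the text).
print(division_sol(9, 67, 95))    # ~92.5, vs. an unweighted average of 81
# Greensville: 93% ED; ED rate 59, Not ED rate 76.
print(division_sol(93, 59, 76))   # ~60.2, vs. an unweighted average of 67.5
```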

For a more complete view of the situation, here is a graph of division pass rates, Not ED, ED, and SOL, all plotted v. the ED percentage.

[graph: division reading pass rates (Not ED, ED, SOL) vs. % ED]

The Falls Church points are enlarged and filled with purple; Greensville, maroon; Richmond, gold.

Let’s simplify the picture by looking just at the least squares fitted lines:

[graph: least squares fitted lines for the reading pass rates]

The statistics of the fitted lines tell three different stories:

  • ED: The fitted line shows a 0.43% decrease in the ED pass rate for a 10% increase in the ED population, but the correlation is minuscule.
  • Not ED: The slope is minus 1.35% for a 10% increase in ED, and there is something of a correlation. It looks like increasing the ED population is mildly related to a decrease in the Not ED rate, but not so much the ED rate (this kind of data can’t show whether the ED population increase causes part of the Not ED pass rate decrease).
  • SOL: The SOL pass rate decreases by 2.77% for a 10% increase in the ED population. The 35% R-squared is a pretty good correlation, and the reason is obvious: Divisions with larger ED populations have more ED (i.e., lower) pass rates included in the average.

The math data tell much the same story.

[graph: division math pass rates (Not ED, ED, SOL) vs. % ED]

Here the average of the Falls Church ED/Not ED pass rates is 74%, but the small ED population results in an SOL pass rate 13 points higher. The Greensville SOL reflects a three-point penalty vs. the ED/Not ED average, which is that small only because the ED and Not ED rates are just seven points apart.

Notice that the Falls Church ED rate again is far below the Not ED, here by 33%, which is double the state average difference.  Either those Falls Church schools are coasting with unusually bright Not ED students, or struggling with unusually low-performing ED students, or doing a poor teaching job with the ED group, or some combination of such factors.

[graph: least squares fitted lines for the math pass rates]

The slopes again show how the SOL average penalizes divisions with large ED populations.  As well, the R-squared for the Not ED group suggests, even more strongly than in the reading data, that increasing the ED population is associated with an effect on the performance of the Not ED group.

We cannot infer from the data why the Board of “Education” would embrace this system that punishes poverty.  We can notice, however, that the system is unfair on its face. 

Indeed, the Board had a fair measure of learning, the SGP, that was independent of poverty. But the Board abandoned that system on the flimsy excuse that it could not calculate the results until summer school results were in. In fact, they knew that when they started calculating the SGP. As well, they were (and are) perfectly capable of calculating the SGPs in May for the students (and teachers) not involved in summer school. Waiting until August merely gives them a chance to camouflage some of the poor performances during the regular school year.

Your tax dollars at “work.”

Here, for the record, is a list of the divisions that received more than a 2% reading SOL boost from the ED/Not ED average in 2018:

[table: divisions with a reading SOL boost of more than 2%]

And here are those that suffered a penalty of 2% or more:

[table: divisions with a reading SOL penalty of 2% or more]

Ah, well, we can make one inference:  If you’re that Board and you’re going to make an enemy, better the Superintendent in Greensville or Colonial Beach than the one in Falls Church or Loudoun.

Don’t Blame the Kids, Chapter MMXIX

The previous post shows (yet again) that lack of money does not explain Richmond’s awful SOL pass rates.

That post, however, left open the question whether Richmond’s relatively large population of economically disadvantaged (“ED”) students might explain Richmond’s poor performance, at least in part.  The VDOE database has the pass rate data to answer that question.

Let’s start by looking at the division average reading pass rates for both the ED and Not ED groups as functions of the division expenditure in excess of the Required Local Effort.

[graph: reading pass rates (ED and Not ED) vs. % excess over Required Local Effort]

The least squares line fitted to the Not ED pass rates shows an increase of 1.15% for a 100% increase in the excess expenditure but the correlation is close to vanishingly small.  The line fitted to the ED pass rates in fact shows a decrease of 3.3% per hundred but still with a correlation that would not support a conclusion anywhere except, perhaps, in a sociology thesis.

These data are consistent with the earlier analysis that showed no benefit to ED or Not ED pass rates from increased per student day school expenditures, increased instructional salaries, or increased per student number of instructional positions.

In the Not ED data, Richmond is the larger diamond with the gold fill, at a pass rate of 74.3%, fourth worst in the state. The Richmond ED rate, the larger circle, again with the gold fill, is 51.7%, second worst in Virginia. In the previous post, Richmond’s overall SOL pass rate was 58.9%, third worst.

In short, large ED population or not, Richmond is doing a terrible job of teaching all its students.

For reference, the peer cities are in red fill, from the left Norfolk, Newport News, and Hampton.  Charles City is green; Lynchburg, blue.

Eight of the ten highest-priced divisions beat the state reading average for Not ED students (green fill in the table below); four beat the ED average.

[table: the ten highest-excess divisions]

Of the ten lowest-excess divisions, six beat the average for Not ED students while nine beat the ED average.

[table: the ten lowest-excess divisions]

As to the divisions of interest, here are their pass rates expressed as differences from the averages of the division pass rates.

[table: pass rates as differences from the division averages]

The data for the other subjects are consistent with the conclusion that Richmond’s schools are ineffective for both ED and Not ED students.

[graphs: other-subject pass rates vs. excess expenditure]

Notice that I had to expand the axis to capture Richmond’s tied-for-worst ED writing pass rate.

[graphs: remaining subject pass rates, ED and Not ED, vs. excess expenditure]

The next time RPS starts (well, continues) to whine about needing more money for instruction, please be sure to ask them exactly what that money would buy, given that more money in other divisions doesn’t seem to buy anything but more taxes.

More on Money and Learning

A helpful reader points out that the 2018 Required Local Effort report is up on the Web.

Va. Code § 22.1-97 requires an annual report showing whether each school division had provided the annual expenditure required by the Standards of Quality.  The report presents the data as “required local effort” (“RLE”) per school division, along with the actual expenditure, both in dollars and as the excess above the RLE.

Note: The report breaks out the data for all divisions, but the SOL database consolidates Emporia with Greensville County, James City County with Williamsburg, and Fairfax City with the County. I’ve calculated those three consolidated RLEs from the individual division data.
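
One reasonable way to do that consolidation is to sum the dollars first and recompute the percentage excess afterward; a minimal sketch, with hypothetical file and column names:

```python
import pandas as pd

# Hypothetical extract of the RLE report; file and column names are mine.
rle = pd.read_csv("rle_2018.csv")  # division, required_effort, actual_expenditure

# Fold the separately-reported divisions into the SOL database's consolidated
# divisions, summing dollars before recomputing the percentage excess.
merge_map = {
    "Emporia": "Greensville County",
    "Williamsburg": "James City County",
    "Fairfax City": "Fairfax County",
}
rle["division"] = rle["division"].replace(merge_map)
merged = rle.groupby("division", as_index=False)[
    ["required_effort", "actual_expenditure"]
].sum()
merged["pct_excess"] = (
    (merged["actual_expenditure"] - merged["required_effort"])
    / merged["required_effort"] * 100
)
```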

To start, here are the 2018 division reading pass rates plotted vs. the expenditures beyond the RLE.

[graph: 2018 reading pass rates vs. % excess over RLE]

The slope of the fitted line would suggest that the average reading pass rate increases by 1.17% for a 100% increase in the excess RLE.  The correlation, however, is minuscule.

Translating the statistics-speak into English: The division excess expenditure does not predict the reading pass rate.

Richmond is the gold square on that graph.  The peer cities are the red diamonds: From the left, Norfolk, Newport News, and Hampton.  Charles City is the green diamond; Lynchburg, the blue.

The Big Spenders, yellow circles on the graph, are:

[table: the Big Spenders]

For the most part, those divisions performed well (green shading for those > the average).

The Little Spenders, the red circles on the graph, are:

[table: the Little Spenders]

With the exception of Petersburg, those divisions also performed well, or close to it.  Notice that six of the ten are in Southwest Virginia and five of those performed quite well on the reading tests.

BTW: Williamsburg had the lowest excess, 3.69%, but that datum disappears here into the average with James City County.

Excel is happy to calculate and plot a Bang/Buck ratio by dividing the RLE excess into the pass rate.  The result:

[graph: Bang/Buck ratio vs. % excess over RLE]

The Bang/Buck correlates well with the reciprocal of the Excess; no surprise there.
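
A minimal sketch of that ratio, again with hypothetical file and column names:

```python
import pandas as pd

# Hypothetical table joining the RLE excess to the reading pass rates;
# names are mine, not VDOE's.
df = pd.read_csv("rle_vs_reading_2018.csv")  # division, pct_excess, reading_pass_rate

# "Dividing the RLE excess into the pass rate":
df["bang_per_buck"] = df["reading_pass_rate"] / df["pct_excess"]

# With a nearly constant numerator (pass rates cluster in a narrow band) and
# the excess in the denominator, the ratio tracks 1/excess almost
# mechanically -- hence "no surprise."
print(df.sort_values("bang_per_buck", ascending=False).head(10))
```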

Looking just at the middle of the curve, we see the divisions of interest underperforming the fitted curve, Hampton only slightly.

[graph: the middle of the Bang/Buck curve]

The data for the other subjects tell much the same story (albeit with a hint of a correlation in the writing data).

[graphs: writing, history & social science, math, and science pass rates vs. % excess over RLE]

All that said, we know that “economically disadvantaged” students underperform their more affluent peers and that Richmond’s large population of ED students puts RPS at a disadvantage in terms of raw SOL pass rates. 

Indeed.  Stay tuned for the next post.