2018 Richmond Crime Rates

The Virginia State Police publish an annual report on Crime in Virginia.  They count the “Type A” offenses reported per police unit:

Arson
Assault
Bribery
Burglary
Counterfeiting/Forgery
Destruction/Damage/Vandalism of Property
Drug/Narcotic Offenses
Embezzlement
Extortion/Blackmail
Fraud Offenses
Gambling Offenses
Homicide
Kidnapping/Abduction
Larceny/Theft
Motor Vehicle Theft
Pornography/Obscene Material
Prostitution Offenses
Robbery
Sex Offenses, Forcible & Nonforcible
Stolen Property Offenses
Weapon Law Violations

This year they classify offenses into three categories:

image

These data have their peculiarities.  First, a counting problem: The published report for 2018 shows [at p.8] 89,701 “Crimes Against the Person,” while their (wonderfully complete but complex) database reports 105,004.  The report says (at page 8), “Several offenses may have occurred in one crime incident; therefore the total number of Group A offenses reported was 418,074.”  Yet the database total is 436,464.

Note: In one sense I’m being unfair. The data from the local sources change daily as after-the-fact information emerges and the database is updated to reflect those changes.  But the database has had six months to settle down, and the differences noted above are VERY large compared to the ordinary after-the-fact changes.

Whatever they are counting, the total should be the total. 

In any case, the numbers below are what they report in the database as of July 17-19, 2019.

They report the numbers by police agency, both the local force and, in most cases, the State Police.  For example, the Richmond Police Department shows 21,859 total offenses and the State Police show 233 in Richmond.  The report also includes data for the colleges (e.g., 1,269 for the VCU Campus Police), the Capitol Police (74 offenses), and state agencies such as the ABC Board (100 offenses).  Just as a statistical matter, the small jurisdictions produce some weird numbers because even a small number of crimes can produce a large change in the crime rate.  As well, the State Police report a significant fraction of the incidents in some small jurisdictions; for instance, in Craig County in 2018, the sheriff reported 26 offenses while the State Police reported 22.
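
Just to make that point concrete, here is a minimal sketch of the arithmetic in Python (the populations are my round numbers for illustration, not VSP data):

    # Offenses per 100 residents; one extra offense barely moves the rate
    # in a big city but visibly moves it in a small county.
    def rate_per_100(offenses, population):
        return 100 * offenses / population

    # Round-number populations, assumed for illustration only.
    print(rate_per_100(26, 5_000))        # 0.52  (small county)
    print(rate_per_100(27, 5_000))        # 0.54  (one more offense: the rate jumps ~4%)
    print(rate_per_100(21_859, 200_000))  # 10.93 (Richmond-sized count)
    print(rate_per_100(21_860, 200_000))  # 10.93 (one more offense: no visible change)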

The data below are for the jurisdiction’s local law enforcement agency only.

To start, here are the 2018 counts, expressed as Type A offense reports per 100 population vs. population.

image

Richmond is the gold square.  The red diamonds, from the left, are the peer jurisdictions, Hampton, Newport News, and Norfolk.  South Boston and Clifton Forge are omitted for lack of population data.

There is no particular reason to expect these data to fit a straight line but Excel is happy to fit one.  The slope suggests that the rate (per hundred population) increases by about 0.12 for a population increase of 100,000.  The R², however, tells us that population explains less than half of 1% of the variance in the crime rate; i.e., overall crime rate (by this measure) does not correlate with population.
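
For anyone who wants to check the fit without Excel, here is a minimal sketch of the same computation (the arrays are placeholders; substitute the population and rate columns from the database):

    # Least-squares line of rate (per 100 population) on population,
    # as Excel's trendline fits it. These arrays are placeholders,
    # not the actual VSP columns.
    import numpy as np

    population = np.array([5_000, 25_000, 100_000, 230_000, 450_000, 1_100_000])
    rate = np.array([4.1, 6.3, 5.2, 9.8, 3.9, 4.5])

    slope, intercept = np.polyfit(population, rate, 1)
    predicted = slope * population + intercept
    r_squared = 1 - ((rate - predicted) ** 2).sum() / ((rate - rate.mean()) ** 2).sum()

    print(slope * 100_000)  # change in rate per 100,000 population (post reports ~0.12)
    print(r_squared)        # post reports under 0.005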

Here is the same graph, with the population axis rescaled to cut off the Big Guys (Fairfax, Va. Beach, Prince Wm., Loudoun, Chesterfield, and Henrico) in order to emphasize the distribution of the smaller jurisdictions.

image

The Top Twenty for offense rates all are cities.

image

And most of the Leaders there are little guys.

If we sort by population, we get:

image

Notice the large differences between the rates of the large cities and the large counties.

Just to cleanse the palate, here are the twenty jurisdictions with the lowest counts per hundred population (all are counties).

image

CAVEATS: These numbers tell us about overall crime rates but not about the environment faced by any particular citizen.  As well, the VSP emphasizes that, as we see above, population is not a good predictor of crime rate.  They list [at p.8] other factors:

1. Population density and degree of urbanization;
2. Population variations in composition and stability;
3. Economic conditions and employment availability;
4. Mores, cultural conditions, education, and religious characteristics;
5. Family cohesiveness;
6. Climate, including seasonal weather conditions;
7. Effective strength of the police force;
8. Standards governing appointments to the police force;
9. Attitudes and policies of the courts, prosecutors and corrections;
10. Citizen attitudes toward crime and police;
11. The administrative and investigative efficiency of police agencies and the organization and cooperation of adjoining and overlapping police jurisdictions;
12. Crime reporting practices of citizens.

Here are the rates for Richmond, the total for all jurisdictions (still counting only the local law enforcement agencies), and the peer cities for the most-reported offenses.

image

And some less common offenses:

image

And some still less common crimes:

image

The 2018 Richmond rate increased to 9.76 from 8.65 in 2017 while the state total continued a decline to just above half the Richmond rate.

image

As the graph at the top shows, the Type A total is driven by the property crime numbers.  To see how violent and drug crime are doing, we have to look underneath the totals.

When we do that, we see that the Richmond count of simple assaults rose a bit in ‘18, as did the drug offenses.

image

Note: This graph and those immediately below report the raw counts of offenses reported to the Richmond Police Dept., not the count per 100.  Throughout this period, the Richmond population has been just over 200,000, with very little change, so you can get close to the rates per 100 by dividing these numbers by 2,000.  For example, the quick number for drug offenses is just short of 1.0 per hundred while the actual rate is 0.90.
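
In code, the shortcut is a one-liner (the count below is illustrative, not an RPD figure):

    # With the population steady just over 200,000, raw count / 2,000
    # approximates offenses per 100 residents (200,000 / 100 = 2,000).
    def quick_rate_per_100(raw_count):
        return raw_count / 2_000

    print(quick_rate_per_100(1_950))  # 0.975 -- an illustrative count, not RPD data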

The robbery numbers continued a long downward trend; aggravated assaults continued to rise.

image

The murder and kidnap counts dropped.  The decreases from the early 2000’s, both here and above, are remarkable.

image

For a list of the hot blocks for drugs in Richmond see this page.  And see this page for data showing a nice improvement in Forest Hill (with some recent slippage).

Much of Richmond’s plethora of crime is drug-related.

To complement the still outrageous crime rate, our schools are among the worst in the state and our public housing agency maintains a sanctuary for crime on its property.  To support all this dysfunction, we pay some of the highest taxes in the state.  Go figure.

Note: In the past, Mr. Westerberg of the VSP kindly furnished a copy of the data as an Excel spreadsheet, so I didn’t have to copy the numbers from the PDF report on the Web.  He now is enjoying his retirement and the VSP has put up a database with more numbers than one might want.

“No Tax Increase” Increases

Following last year’s increase in the meals tax (ca. $9.1 million per year), our Mayor asked Council for a nine cent increase in the property tax and a fifty-cent-per-pack cigarette tax (ca. $24 million per year).  Having lost on the real estate increase this year, Mayor Stoney “did not rule out the idea of proposing a real estate tax increase” next year.

All that sounds like the danger is at least on hold until next year.

But wait!  Councilman Agelasto’s July newsletter paints on a larger canvas:

  • Property assessments will rise about 8% this year;
  • Gas rates will increase 3.5%;
  • The typical water bill will increase by $1.41;
  • The typical wastewater bill will increase by $2.39; and
  • The typical stormwater utility rate will increase by 4%.

The BLS tells us the Consumer Price Index for urban consumers rose 1.8%, May ‘18 to May ‘19.

Try not to think about that as you dodge the potholes.  And as you hear about RPS’s demand for more money while it already is wasting some fifty million dollars a year.

Where Have All the Dollars Gone?

RPS Wants More Money But Already Is Wasting At Least $52.6 Million a Year

Table 13 in the Superintendent’s Annual Report gives us the 2018 disbursements for each of 132 school divisions, along with the end-of-year Average Daily Membership (“ADM”).  If we calculate the total disbursements per student, leaving out facilities, debt service, and contingency reserve, we see that Richmond is the eleventh most expensive division.
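
For those playing along at home, here is a sketch of that calculation, assuming Table 13 has been exported to a CSV with these (hypothetical) column names:

    # Per-student disbursements from a hypothetical export of Table 13.
    import pandas as pd

    df = pd.read_csv("table13_2018.csv")  # assumed file and column names

    excluded = ["facilities", "debt_service", "contingency_reserve"]
    spending = [c for c in df.columns if c not in excluded + ["division", "adm"]]

    df["per_student"] = df[spending].sum(axis=1) / df["adm"]
    ranked = df.sort_values("per_student", ascending=False).reset_index(drop=True)

    # Rank 1 = most expensive per student; the text above says Richmond is 11th.
    print(ranked.index[ranked["division"] == "Richmond City"][0] + 1)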

image

Not only does RPS spend more per student than the average division, it spends still more than the school systems of the peer cities, Hampton, Newport News, and Norfolk.

image

Table 13 breaks out the expenditures into ten basic categories (as well as the three I’ve left out).  Here are those data for Richmond, expressed as differences from the per student division averages.

image

The $1,659 Richmond excess in the Instruction category accounts for 57.5% of the $2,887 total.

If we multiply these numbers by the Richmond ADM, we get the excess spending, here expressed in millions of dollars.

image

If Richmond paid higher-than-average salaries or had more teachers than average, that could help explain (perhaps even justify) some of that $38.2 million excess “Instruction” spending.

We have data on that:

  • The average instructional salary in Richmond (Table 19) was $53,138.87 while the division average was $58,677.08.  Multiplying the –$5,538.21 difference by Richmond’s 1,905.22 positions gives –$10.6 million.  Richmond is saving $10.6 million vs. the average division by its lower-than-average salaries.
  • Dividing Richmond’s 1,905.22 instructional positions by the 23,036.78 ADM gives 0.08270 teachers per student.  The state number is 0.08583.  The difference is a Richmond deficit of 0.00313 teachers per student, which at that $53K average salary is a saving of $3.8 million vs. the division average (arithmetic sketched below).
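
All of those figures come straight from Tables 13 and 19; here is the arithmetic, sketched in Python:

    # Reproducing the bullet-point arithmetic from Tables 13 and 19.
    richmond_adm = 23_036.78          # end-of-year Average Daily Membership
    richmond_positions = 1_905.22     # instructional positions (Table 19)
    richmond_salary = 53_138.87       # Richmond average instructional salary
    division_avg_salary = 58_677.08   # division average salary
    state_ratio = 0.08583             # state instructional positions per student

    # Below-average salaries save Richmond money vs. the average division.
    salary_saving = (division_avg_salary - richmond_salary) * richmond_positions

    # So does the below-average staffing ratio.
    richmond_ratio = richmond_positions / richmond_adm        # 0.08270
    staffing_saving = (state_ratio - richmond_ratio) * richmond_adm * richmond_salary

    # Yet Table 13 shows a $1,659 per-student excess in Instruction.
    instruction_excess = 1_659 * richmond_adm

    print(round(salary_saving / 1e6, 1))       # ~10.6
    print(round(staffing_saving / 1e6, 1))     # ~3.8
    print(round(instruction_excess / 1e6, 1))  # ~38.2
    # 38.2 + 10.6 + 3.8 = the $52.6 million total discussed below.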

Thus the number of teachers and average salary serve to exacerbate, not mitigate, Richmond’s excess spending for instruction (here, in millions of dollars):

image

The SOL pass rates give a measure of the return for that $52.6 million excess.

Note: As to SOLs, we have seen that Virginia’s economically disadvantaged (“ED”) students underperform their more affluent peers (“Not ED”) by some 15 to 20 points, depending on the test.  This makes the division SOL average a biased measure that punishes the divisions with larger percentages of ED students.  So let’s look at the pass rates for both groups, not at the division average pass rates.

image

That’s clear enough: Division average pass rates do not correlate with the spending.  Richmond (the enlarged points with gold fill) spends a lot of money and gets wretched pass rates.  The peer cities (the red fill, from the left Hampton, Newport News, and Norfolk) do much better on much less money.

The math data tell the same story, albeit with an even more dismal showing by Richmond.

image

Curiously, with an R-squared of 7.9%, the ED data for math suggest a slight negative correlation between pass rates and spending.

In light of this, we can wonder why City Council this year gave RPS an extra $37 million of our money without any showing that the extra money would help anything, and without any trace of accountability from anyone for the effect of the new money or the lack of effect of that excess $52.6 million in 2018.

Bigger Bucks Vanishing Into Failure

We have seen that, for the most part, Richmond pays its teachers less than average:

image

Let’s see if we can figure out where the money is going and what we’re getting for it.

Table 13 in the Superintendent’s Annual Report shows disbursements by division.  The data below are from the 2018 table, not including the spending for facilities, debt service, and contingency.  The Richmond total is $361,601,063.

(That’s just north of a third of a BILLION dollars.)

The table also gives the end-of-year Average Daily Membership (“ADM”).  Dividing the total disbursement by the ADM for Richmond, the three peer cities, and the division average gives us:

image

image

Breaking out that $2,887 per student excess by category gives us:

image

Note:  The table tells us that the preK spending includes Head Start; the “other” educational spending goes for enterprise operations, community service programs, and other matters that do not involve the delivery of instruction or related activities for K-12 students.

Here we see particularly large spending for attendance and health (6% of the $2,887 excess), food services (8%), pre-kindergarten (19%), and “other” educational programs (6%).  Those are the small change, however: 57% of the excess goes for instruction.

The only way to get this unusually large expenditure for instruction while paying most teachers less than average is to have a lot of teachers or spend a lot of money on something other than teachers’ salaries.

Let’s look at that from two directions: First, the relationship, if any, between disbursements and SOL pass rates; second, the relationship, if any, between number of teachers and pass rates.

We know that, in terms of SOL pass rates, Virginia’s economically disadvantaged (“ED”) students underperform their more affluent peers (“Not ED”) by ca. 15-20%.  This renders the division average pass rate unfair to those divisions with relatively large ED populations.  So let’s look at the data for both groups, not at the overall division averages:

image

Richmond is the gold points.  Whatever they’re spending all that money on, we’re getting precious little return.

The peer cities, Hampton, Newport News, and Norfolk are the red points, left to right.  They are spending a lot less per student than Richmond and getting a lot more performance.

Lynchburg is blue; Charles City, light green.

The fitted lines suggest that pass rates for both groups decrease with increasing disbursements but the R-squared values are tiny for the Not ED group and right small for the ED.

The math data tell much the same story, but with R-squared values that suggest a bit more of a (negative) relationship between pass rate and spending.

image

In any case, the divisions that spend more per student do not get better SOL pass rates on average.

The Superintendent’s Table 19 gives us the number of instructional positions (classroom teachers, guidance counselors, librarians, technology instructors, principals, and assistant principals) per student.   Juxtaposing those data with the pass rates gives:

image

BTW: The peers have rearranged here.  From the left it’s Newport News, Hampton, and Norfolk.  Notice that Hampton and Norfolk have more instructional positions per student than we do, while spending less per student.

Go figure!  Richmond is spending lots of money on instruction while paying most of its teachers below-average salaries and operating with one of the lowest instructional position/student ratios in the state.

And we’re getting precious little for it.

I’ll save the spending puzzle for another post.  First hypothesis: Richmond is paying a lot to people other than teachers.

In the meantime, these data tell us that division average reading pass rates increase with the instructional position/pupil ratio.  The R-squared for the ED students, 5.9%, says there’s a hint of a correlation (for a simple linear fit, ρ = √0.059 ≈ +0.24).  For the Not ED pass rates, we also see a positive slope but the R-squared is less than 1%.

The math data tell the same story but with even lower R-squared values.

image

So, as to Richmond: More money, fewer instructional positions, lower teacher pay, awful pass rates.  Whatever the problem with Richmond’s schools may be, it’s not a shortage of money.

Pop Quiz

It’s Spring!  The data VDOE has had since October have made it into the Superintendent’s Annual Report.  In particular, we now have the salary data.

Please review the following excerpts from that report in preparation for the quiz:

image

image

Did you study carefully? If not, that’s OK. The quiz is open book.

——————————————————————–

Quiz:

  1. If you were fresh out of college with a teaching license and were thinking about living in the Richmond area, would you prefer an elementary school job with RPS or one of the Counties?
  2. If you had taught elementary grades in Richmond for a couple of years and had demonstrated that you were a capable teacher, would you be tempted to jump ship for one of the Counties?
  3. Will this salary structure, in combination with the tough clientele and awful politics in Richmond, do a reverse cull of the better teachers, tending to leave Richmond with the less effective ones?

——————————————————————–

Time’s up.  Turn in your papers.

Congratulations!  You’ve aced it!

Now, for your homework: Prepare for tomorrow’s quiz by learning to count to ten.

OOPS!  I flunked a bigger test:  A kind reader pointed out that I mixed up the labels on both graphs.  Now fixed.

It’s the Principal of the Thing

On April 26, the Free Press posted a story listing ten RPS schools where the principals reportedly were being replaced.  There was an RT-D story about the same time; it has since been taken down, perhaps because of an error in the list.  Now the RT-D has a final list along with the names of the new principals.

The Grade 8 and EOC testing window this year had a last reporting date of March 29; the end of the latest test window for other tests was June 21.  RPS surely had the high school results but probably not the results for the lower grades before the list leaked.  No telling when the list was prepared but, below grade 8, it had to have been before RPS had all the 2019 SOL results.

In any case, VDOE will have the summer results on July 19 but won’t release the final pass rates until mid-August, so we are stuck with the 2018 and earlier data.  Let’s see what those numbers tell us about the ten schools.

Here, as background (you’ll soon see why), are the reading pass rates at Carver.  The 2018 data are missing because the staff there got caught cheating.

image

The “ED” data are the pass rates of the “economically disadvantaged” students.  The “Not ED” points are the data for their more affluent peers.  The missing Not ED data reflect the VDOE suppression rule (no data when a group is <10 students).

The “All” data are the school average.  Here, the very large percentage of ED students makes the ED and All rates nearly equal.

The yellow line is the nominal accreditation value, 75% for English, 70% for everything else.

To the point here,

  • Until the cheating was halted in ‘18, the pass rates for both groups were astronomical, and
  • The ED students were reported to pass at about the same rates as the Not ED, in contrast to the state average difference of some 15 to 20 points.

The math data from Carver tell much the same story.

image

Those data paint a portrait of cheating.  By the school staff, not by the kids.

As a contrast and to set an upper bound for reasonableness, here is Munford, the best-performing of Richmond’s elementary schools in terms of SOL pass rates.

image

image

In that light, let’s look at Fairfield Court, which is one of the ten on the list.

image

image

There are no ED data for 2014.  Whatever the reason, it clearly is not the suppression rule.

These data suggest (but, of course, do not prove) two things:

  1. They were cheating at Fairfield, just not as flagrantly as at Carver, and
  2. Before the 2018 testing, they got the Word that the State was becoming involved and let the numbers return to their actual level.

Of course, the 2018 score drop alone would be a reason to wonder whether it was time for a new principal.

Moving on to the rest of the list, here is Bellevue.

image

image

The ED decreases in ‘17 and ‘18 and the Not ED in ‘18 probably offer a complete explanation for the new principal.

Next, Blackwell.

image

image

Ongoing, sorry performance could explain what happened there.

George Mason:

image

image

Sorry performance in reading and declining math scores.

Ginter Park:

image

image

Hard to know what to make of this: The Not ED rates are unusually close to the ED, with the exception of reading in the last two years; perhaps more to the point, the reading scores are low and neither subject shows ED improvement.

Next, Greene.

image

image

Declining performance, when increasing is what’s needed.

Last among the elementary schools, Overby-Sheppard.

image

image

Could be lack of improvement; the generally small gap between ED and Not ED may indicate a problem.

Turning to the middle schools, here as a marker is Hill, the best of a sorry lot (and NOT on the list).

image

image

(Notice the ED/Not ED gap that runs about twice the state average).

Henderson is the only middle school on the list.

image

image

(Notice the abnormally thin ED/Not ED difference.)

As to the principal, low pass rates and a lack of recent improvement might well explain the turnover.

Then we have the high schools.  First on the list, Wythe.

image

image

The reason for the turnover there is clear enough.  As well, the ED/Not ED difference is remarkably thin.

Next, Marshall.

image

image

Hmmm.  No smoking gun there but another thin ED/Not ED gap.

Next, TJ.

image

image

No obvious reason here for making the list.

Last, one of the two selective high schools, Community.

image

image

If there’s a problem here, it’s not in reading.  Nor in the improvement of the math rates.

Bottom line: Except for a few obvious cases, these data don’t tell us what brought in the new principals.  The data do tell us that all ten principals have big jobs to do.

—————

Postscript: Our neighborhood school, where the principal was replaced last year.

image

image

Good News for Transparency

The estimable Carol Wolf just sent me a copy of the order in Paul Goldman’s Freedom of Information Act suit for documents regarding the Coliseum boondoggle.  These include a stack that the City had released to the Times-Disgrace but for which it wanted over $2,000 to “review” before it would even consider giving them to Paul.  (Can you spell “That Stinks!”?)

My handy ’puter converted this from the PDF; any errors (notably the format) belong to that process, not to the Court:

Virginia:

In the Circuit Court of the City of Richmond, John Marshall Courts Building

Case No.: CL19-2591-6

ORDER

On May 29, 2019, CAME Paul Goldman (“Petitioner”), pro se, and the City of Richmond (“Respondent”), by counsel, on Petitioner’s Petition for Mandamus (the “Petition”), in which Petitioner requested an order in mandamus compelling the Respondent to produce certain documents previously requested by Petitioner in letters dated March 21, 2019 and April 25, 2019, pursuant to the Virginia Freedom of Information Act, Virginia Code 2.2-3700 et seq. (“FOIA” or the “Act”). The letters correspond with Count I and Count II of the Petition, respectively.

WHEREUPON, with the parties in apparent agreement that the documents covered in the Petition are “public records” subject to disclosure unless excluded under the Act, the hearing proceeded with Petitioner first offering an opening statement followed by that of the Respondent. Respondent presented several documents during its presentation and made argument concerning their meaning and legal import but did not move them into evidence. Throughout, Petitioner argued and objected to Respondent’s position, observing that under the Act, “the public body shall bear the burden of proof to establish an exclusion by a preponderance of the evidence.” Va. Code Ann. § 2.2-3713(E). This requirement extends to Respondent’s affidavit of a witness which purported to detail the reasons and circumstances that would support Respondent’s argument that the information sought in the Petition was subject to ongoing negotiation and thereby excluded from disclosure under Virginia Code 2.2-3705.1(12). The Court stated that it was unwilling to accept statements by counsel for Respondent as evidence over objection. This addressed the information sought by Petitioner in Count I of the Petition.

As to the information sought in Count II, Petitioner contends that he now seeks only 2643 documents that Respondent has previously provided to the Richmond Times Dispatch (the “RTD”), pursuant to a FOIA request. Respondent contends that before it can know what documents to disclose, if any, Respondent needs to examine each document to determine whether disclosure is proper. Further, Respondent, by counsel, stated that some documents were provided to the RTD inadvertently and may not be subject to disclosure. Respondent contends that the case of Tull v. Brown, 255 Va. 177, 494 S.E.2d 855 (1998) stands for the proposition that a public body’s prior disclosure of information does not waive its claim of exemption for the same information in a subsequent request, for the well-known reason that estoppel does not operate against the government. The Court finds, however, that the Tull case is distinguishable from the case at bar, and accordingly, not controlling for reasons stated on the record.

WHEREFORE, it is hereby ORDERED, ADJUDGED AND DECREED that Respondent shall disclose the documents requested in the Petition upon payment by Petitioner of costs in the amount of $50.00 for disclosure under Count I and costs in the amount of $200.00 for disclosure under Count II, within ten (10) days of such payment. It is further ORDERED that Petitioner’s request for attorney’s fees is DENIED.

The Clerk is directed to forward a copy of this Order to all parties.

Exceptions are noted.

IT IS SO ORDERED.

ENTER: 6/7/19


Help for the Board of “Education”

Now that our Board of “Education” has demonstrated that it does not know how – more likely, does not wish – to inspect its database in order to detect official cheating to boost graduation rates, I thought I’d give them an easy place to start doing their job.

And give the Governor a list of places where the Board’s malfeasance probably has allowed school divisions to issue bogus diplomas wholesale.

Brief background:

  • The End of Course (“EOC”) SOL tests are the gateway to graduation; to earn a standard diploma, a student must pass two English EOC tests, one math, one history & social science, and one lab science.
  • On average, the pass rate of Economically Disadvantaged (“ED”) students on the EOC tests is about 17% lower than the rate of their more affluent (“Not ED”) peers; consistent with that, ED students graduate at about a 6% lower rate than Not ED.
  • Richmond’s ED students in 2018 passed the reading and math EOC tests at a rate 13% lower than the Not ED students but, as a result of wholesale cheating (by the schools, not the students!), the ED cohort graduation rate was 7.5% higher than the Not ED rate.
  • The Board of “Education” did not notice this discrepancy (nor similar warnings in the data from earlier years).

Let’s start with a plot of the 2018 division average differences between the ED and Not ED graduation rates against the difference in EOC reading pass rates.

image

The state average is the red circle showing an ED pass rate 15.6% below the Not ED rate and an ED graduation rate 6.0% below.

Richmond, the gold square, is up in the anomalous zone, with an unexceptional pass rate difference but with an ED graduation rate 7.5% higher than the Not ED rate.  We now know that the reason for that anomaly is wholesale cheating.  By the schools.

As a courtesy to my readers there, the blue diamond on the graph is Lynchburg; the green, Charles City.

BTW: The points to the right of the Y-axis are divisions where the ED students passed at higher rates than the Not ED.  If the Board of “Education” elects to start doing its job, it should also look at those divisions to see whether they are messing with the classification of handicapped students or otherwise manipulating the data.
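
If the Board wanted an easy place to start, the screen is only a few lines of code.  Here is a sketch, assuming the division-level rates sit in a table with these (hypothetical) column names:

    # Flag divisions in the anomalous zone: ED pass rates below Not ED
    # but ED graduation rates above Not ED.
    import pandas as pd

    df = pd.read_csv("division_rates_2018.csv")  # assumed file and column names

    df["pass_gap"] = df["ed_eoc_reading_pass"] - df["not_ed_eoc_reading_pass"]
    df["grad_gap"] = df["ed_grad_rate"] - df["not_ed_grad_rate"]

    flagged = df[(df["pass_gap"] < 0) & (df["grad_gap"] > 0)]
    print(flagged[["division", "pass_gap", "grad_gap"]]
          .sort_values("grad_gap", ascending=False))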

The math data paint a similar picture.

image

Let’s take a closer look at the divisions that joined Richmond in the group with anomalously high ED graduation rates.  (That’s not to say that some of these other data are not abnormal.  I’m just going for the obvious cases.)

image

The math data again tell much the same story.

image

Small populations can lead to large variability in averages so these data, standing alone, don’t say much about the smaller divisions.  But, for sure, the data wave a large, red flag over the ED graduation rates from Norfolk, Newport News, Va. Beach, Portsmouth, Arlington, Roanoke, and Lynchburg.

But don’t hold your breath waiting for the Board of “Education” to look behind those numbers.

High School Corruption and State Malfeasance

In 2016, VDOE conducted a course schedule audit at Armstrong as part of the work following denial of accreditation there.  VDOE reported discrepancies in course requirements and transcript accuracy, inter alia.  The 2017 follow-up audit “concluded that there was not sufficient evidence to determine that problems identified . . . had been resolved.”

In 2018, at the request of our Superintendent, VDOE expanded the audit to include all five mainstream high schools.  They found:

  • Bell schedules did not meet the number of hours required by the Standards of Accreditation.
  • Verified credits did not populate in the transcripts.
  • Attendance data was incorrect throughout the transcripts.
  • Some students received one credit for classes that should not have carried credit.
  • Some students received two credits for classes that should have carried one credit, such as Career and Technical Education (CTE) classes.
  • Credit was incorrectly given for what appear to be locally developed elective courses without evidence of approval by Richmond Public Schools Board.
  • Credit was incorrectly given for middle school courses ineligible for high school credit.
  • Course sequencing issues were identified.
  • Academic and Career Plans lacked meaningful content.

Translating that from the careful bureaucratese of VDOE: The schools were cheating wholesale to boost the graduation rates.  Indeed, the RT-D later reported the schools were:

Rubber-stamping student work. Choosing to use an alternative test instead of giving students the common state test. Putting students on individualized education programs to circumvent state graduation requirements.

As Scott Adams is fond of saying,

Wherever you have large stakes, an opportunity for wrong-doing, and a small risk of getting caught, wrong-doing happens. That’s a universal law of human awfulness. When humans CAN cheat, they do. Or at least enough of them do.

RPS seems to be determined to prove Adams right.  This latest outrage comes in the wake of the wholesale cheating at Carver.

To be clear: We’re talking here about cheating by the school staff, not by the kids.

In the past, this kind of thing has been manifest in the data.  An earlier post looked at the RPS pass rate averages and suggested cheating.  So I thought I’d take a (further) look at the graduation numbers.

Notes on the data:

  • Economically disadvantaged students (here, “ED”) underperform their more affluent peers (“Not ED”) on the SOL pass rate averages by somewhere around twenty points.  Where there are large ED populations, the overall SOL averages can be misleading, so we’ll look at the ED and Not ED pass rates separately.
  • The End of Course (“EOC”) tests are the gateway to obtaining “verified credits” toward the graduation requirements.  To graduate with a standard diploma, the student must pass two EOC tests in English, one in math, one in history & social sciences, and one in a lab science.
  • On the 2018 EOC tests, about 60% of the Richmond students tested were reported to be ED.

image

  • That’s an average.  The tested ED populations of our mainstream high schools vary considerably (the selective high schools all do very well, thank you, and the boot camp Richmond Alternative is a mess so there’s nothing to be learned here from their data).

image

image

  • The graduation rates here are the boosted rates the Board of “Education” uses to make their constituent schools look better than they really are.

With that out of the way, let’s turn to the cohort graduation rates of those five schools and the pass rates there on the EOC tests. 

First, on the reading tests:

image

The orange diamonds are the 2018 cohort graduation rates/2018 pass rates of the ED students at the five high schools.  The blue circles, the same data for the Not ED students.

The easy way to read the graph: Look at the slope from orange ED to blue Not ED (a code sketch of this heuristic follows the list):

  • Up and to the right is the expected condition with the ED students both scoring below and graduating at lower rates than the Not ED. 
  • As the line approaches horizontal, the ED pass rate is lower but the graduation rate approaches the Not ED rate.  We can wonder what is going on.
  • When the line slopes down, the lower scoring ED students are graduating at a higher rate than the Not ED.  Think cheating.
  • When the line approaches or passes vertical, ED students are passing at about the same rate as the Not ED, and graduating at a higher rate.  Something is doubly rotten.
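
Here is that heuristic sketched in code (the thresholds and sample numbers are mine, for illustration only):

    # Classify a school by the direction of the segment from its ED point
    # to its Not ED point on the pass-rate (x) vs. graduation-rate (y) plot.
    def classify(pass_ed, grad_ed, pass_not_ed, grad_not_ed):
        dx = pass_not_ed - pass_ed   # pass-rate gap (normally positive)
        dy = grad_not_ed - grad_ed   # graduation-rate gap (normally positive)
        if dx <= 2:                  # pass rates nearly equal; threshold is arbitrary
            return "near vertical: something is doubly rotten"
        if dy < 0:
            return "slopes down: lower-scoring ED graduate MORE; think cheating"
        if dy <= 2:
            return "near horizontal: ED graduation approaches Not ED; wonder why"
        return "up and to the right: the expected condition"

    # Illustrative numbers only, not actual 2018 rates.
    print(classify(55, 70, 70, 80))  # expected
    print(classify(55, 82, 75, 75))  # slopes down
    print(classify(68, 90, 70, 78))  # near vertical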

Here, the Armstrong data look reasonable: The Not ED students outscored the ED by almost fifteen points and enjoyed an almost ten point better graduation rate.  We can wonder whether this is the result of the earlier audit there (see below).

In contrast, the ED students at both John Marshall and TJ graduated at higher rates than their Not ED peers while passing the EOC tests at 20+ and ~15 percent lower rates.  Something boosted those ED graduation rates.  Cheating is the obvious first candidate.

Wythe shows a more striking, anomalous pattern, with a (smaller than expected) 10% pass rate deficit and a twelve point higher graduation rate for the ED students.

Huguenot is anomalous in two respects: The pass rates of the two groups are almost identical and the ED students graduated at a rate almost 30% higher than the Not ED.  We can wonder whether the large ESL population there bears on this.

The math data tell the same stories (at dishearteningly lower pass rates).

image

Armstrong’s data again look reasonable.  Wythe and, especially, Marshall and TJ show higher ED graduation rates than Not ED, with lower pass rates.  Huguenot shows a much higher ED graduation rate at about the same pass rate.

To gain some context, let’s look at some history. 

First, as background, the state averages.

image

image

There is the norm: Slope up from orange to blue.  In terms of numbers, ED pass rates about twenty points below the Not ED and cohort graduation rates about seven points below.

Turning to Richmond, here is Armstrong on the reading EOC tests:

image

2014 and 2015 clearly suggest cheating.  2016 suggests it; 2017 says it again.  But 2018 looks reasonable. 

We might infer that the presence of VDOE in 2016 had a salutary effect at Armstrong.  A caveat: Because of the very small numbers of Not ED students at Armstrong, the Not ED pass and graduation rates can be expected to fluctuate considerably from year to year (as they do here).  These data fit the inference of cheating before ‘18 with a course correction then but there might well be a more benign explanation.

The math data at Armstrong also fit the inference of cheating, abated upon the appearance of VDOE.

image

The Marshall data are consistent with unabated cheating, albeit with higher pass and graduation rates than at Armstrong.

image

image

The Wythe data tell a similar story, with a suggestion of robust cheating in 2018.

image

image

Huguenot data suggest ongoing, rampant cheating.

image

image

On both tests the 2015 Huguenot numbers show ED and Not ED pass rates that are nearly identical, so the positive slope that year does not offer any solace.

Finally we have TJ, with anomalous but OK patterns in 2017 (albeit with unacceptably low ED pass rates) but otherwise showing data consistent with cheating.

image

image

————————– Rant Begins ————————-

Our Board of “Education” has created a haven for cheating high schools:

  • They don’t look at their mountain of graduation data until after they review a school that has been denied accreditation (if they even look then), and
  • They now have rigged the system so it is almost impossible for a school to be denied accreditation.

The Board has institutionalized Adams’ “small risk of getting caught.”

Why did VDOE ignore its mountain of data and do nothing in Richmond until (1) Armstrong was denied accreditation (under the old, now replaced, system), and (2) our Superintendent invited them in to the other high schools?  Why are they not looking for this pattern at high schools in other divisions? 

I think the answer is at least nonfeasance and probably malfeasance.  It’s past time to fire the Board and appoint people who will actually do the job.

————————– Rant Ends ————————-

It will be interesting to see how all this affects this year’s graduation rates.  Those data should be available in mid-August.  Stay tuned.

Oops: Corrected the link and list of required EOC courses.

Feeding and Failing the Kids

The VDOE data show each school division’s spending for the day school operation (administration, instruction, attendance & health, pupil transportation, and O&M) and then for food services, summer school, adult education, pre-kindergarten, and “other” educational programs (“enterprise operations, community service programs, and other non-[division] programs”).  The previous post discussed the very large expenditures beyond day school for Richmond and some other divisions; that spending correlated fairly well with the division percentage of economically disadvantaged (“ED”) students.

An astute reader asked whether the spending for food services showed a correlation, particularly in the light of the federal support from the National School Lunch Program and the School Breakfast Program for ED students. 

Indeed, there is a decent correlation.

image

Richmond is the gold square.  The red diamonds are the peer cities, from the left Hampton, Newport News, and Norfolk.  Lynchburg is purple; Charles City, green.  Richmond is $108 (18%) above the fitted line.

To look at those figures in context, here they are again (green this time), along with the total spending per student beyond the day school operation (“Excess”) and that total minus the food expenditure.

image

The excess shows a modest correlation while excess minus food shows very little; that correlation in the excess clearly comes from the food expenditure.

As with the overall Excess spending, the SOL pass rates show a modest negative correlation with the food service expenditure.

image

image

Again, as with the overall Excess spending, most of that negative correlation is driven by decreases in pass rates of the students who are not economically disadvantaged.

image

image

These data sing the same song: This extra spending is not associated with more learning.  We can wonder why it’s in the education budget.

We earlier saw that increasing a division’s percentage of ED students is associated with a decrease in the pass rates of the Not ED students more than the ED; these data are consistent with that (but do not proffer an explanation).

Finally, we have the SOL performance vs. the Excess spending other than for food services:

image

image

There is a bit of a correlation here, but in the wrong direction. 

Closer to home, the Richmond spending remains a substantial outlier.  RPS is spending $856 per student more in non-food Excess than the average division; that $19.7 million is buying the reading and math deficits you see in these graphs.

RPS keeps demonstrating that it does not know how to competently run its day school operation.  While they are thus failing to educate Richmond’s schoolchildren, they are spending a lot of money on things other than the day school operation. 

If money were as important to performance as they say it is (although the data say money is irrelevant to performance), you’d think they would redirect some of those dollars.  Go figure.