WSJ College Rankings

The Wednesday edition of the Wall Street Journal contains, behind the paywall, their annual rankings of “nearly 800 U.S. colleges and universities.”

The Top Ten are Harvard, Stanford, MIT, Yale, Duke, Brown, Caltech, and Princeton, with Johns Hopkins and Northwestern tied for tenth.

The top Virginia schools are:

image

The WSJ explains their methodology:

Rankings are based on 15 key indicators that assess colleges in four areas: Outcomes, Resources, Engagement and Environment. Outcomes accounts for 40% of the weighting and measures things like the salary graduates earn and the debt burden they accrue. Resources, with a 30% weighting, is mainly a proxy for the spending schools put into instruction and student services. Engagement, drawn mostly from a student survey and with a 20% weight, examines views on things like teaching and interactions with faculty and other students. Environment, at 10%, assesses the diversity of the university community.
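In other words, the composite score is a weighted sum across the four areas. Here is a minimal sketch of that arithmetic; the category scores and the 0-100 scale are hypothetical (the WSJ does not publish its per-category scales), so treat this as illustrative, not their actual computation:

```python
# Weighted composite of four category scores, per the stated weights.
# Category scores (0-100 scale here) are hypothetical.
weights = {"Outcomes": 0.40, "Resources": 0.30,
           "Engagement": 0.20, "Environment": 0.10}

def composite(scores):
    """Weighted sum of the four category scores."""
    return sum(weights[k] * scores[k] for k in weights)

example = {"Outcomes": 90, "Resources": 80, "Engagement": 70, "Environment": 60}
print(composite(example))  # 0.4*90 + 0.3*80 + 0.2*70 + 0.1*60 = 80.0
```

The 40% Outcomes weight means a school's salary-and-debt showing moves the composite four times as much as its diversity showing.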

The Web version of the article has a helpful tool that lets us look into the rankings of Virginia’s schools.

image

They don’t list Virginia State.

One major reason for attending college, of course, is that college graduates make more money. (“Most 4-year schools deliver a 15 percent return on investment – double that of the stock market.”)

Plotting the ranking v. outcome data for the thirteen Virginia schools ranked above 400, we see:

image

The three “400” outcomes there are “>400” in the data, i.e., all three would be somewhere off the left side of the graph if the rankings broke out the >400 group.

The dashed line denotes ranking = outcome. Schools above that line received rankings better than their outcomes; schools below, rankings worse. For example, THE University is in the higher outcome group with a 30th-place outcome but a 55th-place ranking.

JMU is the outlier here with an outcome of 106 but a ranking of only 286. No telling how far HSC, VCU, and EMU lie from the line.

Mary Washington is interesting but does not show on the graph: Ranking is somewhere in the 500’s but outcome = 325.

Turning to ranking v. cost, we see, again for the top thirteen:

image

Finally, outcome v. cost (again with the three >400 outcome entries plotted at 400 but actually ranked lower, perhaps much lower, than 400).

image

2021 SOL Results, First Cut

2020 was the first spring since 1998 without SOL tests in Virginia. Then came 2021, when participation in the testing was voluntary.

The VDOE press release says, “[2020-21] was not a normal school year for students and teachers, in Virginia or elsewhere, so making comparisons with prior years would be inappropriate.” The first line of the very next paragraph of the press release then quotes the Superintendent making a comparison: “Virginia’s 2020-2021 SOL test scores tell us what we already knew—students need to be in the classroom without disruption to learn effectively.”

Let’s look at some data and see whether they offer any principled implications.

But first: As we have seen, economically disadvantaged students (“ED”) underperform their more affluent peers (“Not ED”) by around twenty points, depending on the test. This renders the school and division and state averages meaningless because of the varying percentages of ED students. Fortunately, the VDOE database offers data for both groups. Hence the more complicated analyses below.
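A toy calculation shows why the varying ED percentages poison the averages; the pass rates and ED shares here are hypothetical:

```python
# Two divisions with IDENTICAL group pass rates can report very
# different overall averages purely because their ED shares differ.
# All numbers here are hypothetical.
def overall(ed_rate, not_ed_rate, ed_share):
    """Overall pass rate as a mix of the two group rates."""
    return ed_rate * ed_share + not_ed_rate * (1 - ed_share)

# Same group pass rates (60% ED, 80% Not ED) in both divisions...
a = overall(60, 80, 0.25)  # division that is 25% ED
b = overall(60, 80, 0.75)  # division that is 75% ED
print(a, b)  # 75.0 65.0 -- a ten-point gap from demographics alone
```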

To start, let’s look at the numbers of students tested by year for reading in Richmond and statewide.

image

As we might expect, the ED tested counts dropped in 2021 even more precipitously than the Not ED. In Richmond, the Not ED decrease was about double the state average, the ED, nearly three-fold.

image

The pass rates (of those fewer students) fell, relative to 2019, but not as far as we might have feared.

image

The Richmond ED score drop was nearly twice the state average; the Not ED, a bit short of 1.5 times.

image

Note, however, that even a 6.7 point drop in the pass rate is huge.

We can only speculate about the effects of the various factors that might lie beneath these numbers. Those factors might include:

  • Quality of the online instruction,
  • The students’ capability to learn online,
  • Students’ efforts in the presence of an online teacher,
  • Parental desire to see results for their children,
  • Parental opposition to SOL testing,
  • Parental concern for COVID exposure during the testing, and
  • Relaxed graduation requirements.

The ED/Not ED difference increased, both in Richmond and Virginia.

image

The math counts show decreases similar to the reading.

image

image

Those who took the math tests, both ED and Not, did not do well.

image

image

And, again, the Not ED/ED gap of those tested in ‘21 increased.

image

The H&SS test counts were so low that the pass rates must be close relatives of meaningless.

image

image

Hmmm. Even so, the Richmond Not ED pass rate edged above the state average.

The science counts were close to the math numbers.

image

image

The writing data painted yet another messy picture.

image

image

Note the increase in the Richmond ED rate. Looks like that 13.3% were a special group.

There is one clear inference available from these data: The test results may well have been helpful to individual students but making the SOL testing voluntary made the collected results meaningless. Those collected results do not even give any measure of “what we already knew,” although they are not inconsistent with the notion that the online instruction did not work well. We’ll have to wait for the 2022 results to get a clearer idea of the COVID effects.

That’s enough for one post. It will take some time to mine the information in this year’s data; if you’re interested, please stay tuned.

College Education: Debt and Income

The Wall Street Journal has a piece (behind a paywall) comparing student debt and subsequent incomes for a number of college programs. A calculator near the bottom of the page allows the reader to select the degree and then any college(s) to emphasize on a graph. The data are ratios of median debt for graduates in “roughly” 2014-15 to median income two years later.

Caveat: Data are only for students with federal student loans and may not be representative of results for students who did not take out such loans. As well, the data are medians and do not capture the experiences of students who borrowed or earned much more or less.

To start, here is the graph for medical schools, with data for seven selected institutions:

image

Unfortunately for clarity here, the Journal sorts by school name, so we have to look at colors and ratios to find the selected schools on the graph.

The graph shows the ratio of debt to income at each school; the table below the graph shows those ratios as well as the median debt and income for the selected schools. The vertical axis merely expands to allow points for multiple schools in a given debt/income range; position in that direction is irrelevant.
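The ratio itself is simply median debt divided by median income two years out; a one-function sketch (the dollar figures below are hypothetical):

```python
# Debt-to-income ratio as described in the article: median
# federal-loan debt at graduation divided by median income two
# years later. The figures used here are hypothetical.
def debt_income_ratio(median_debt, median_income):
    return round(median_debt / median_income, 2)

print(debt_income_ratio(150_000, 68_500))  # 2.19
```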

I selected Harvard, Stanford, and Columbia here because of their position at the low ratio end of the list. Otherwise, my selections here and below have “Virginia” in the school name or were spotted by my scan of the list.

We might expect graduates of prestigious schools such as Harvard, Stanford, and Columbia to enjoy relatively larger incomes; that is the case here but the low ratios also reflect lower debt levels.

UVa is the best of the Virginia batch, with a 2.19(!) debt-to-income ratio.

With a little help from Professor Bill Gates, we can recast the data as a plot of income v. debt.

image

The upper left corner there is where you’d want your kid.

The two outliers here, University of Puerto Rico-Medical Sciences and Southwest College of Naturopathic Medicine & Health Sciences, compress the rest of the graph. Expanding the income axis to remove those two gives a more detailed picture.

image

Here the relatively higher incomes and lower debt of Harvard et al. stand out. Likewise the more modest and roughly similar incomes and higher to much higher debt levels at the Virginia schools other than UVa.

We have to wonder whether these modest (for doctors) incomes reflect the going rates for interns and not for doctors out in practice. The WSJ article does not contain the word “intern.”

Turning to law schools we see:

image

Here, THE University is playing with the Big Guys, with hefty incomes offsetting the relatively high debt level.

image

Perhaps interestingly, the bulk of the law graduates here see incomes in the vicinity of $60,000, whatever their debt levels.

In contrast to the case of medical schools, most law graduates go to jobs, not internships.

In any event, these and the med school data might serve to raise questions about Willie and Waylon’s advice to mothers to “let [your babies] be doctors and lawyers and such.”

Likewise dentistry (but see VCU beat out Harvard on both the ratio and incomes(!)).

image

image

Some (many?) dentistry graduates take multi-year residencies so, as with the doctors, these data may paint a distorted picture.

Turning to bachelor’s degrees, those in “Business/Commerce General” paint a much prettier picture.

image

We probably should wonder about these incomes: Many of the better students in bachelor’s programs go on to graduate school and won’t show in the WSJ data.

In any event, Chemical Engineering.

image

And Chemistry.

image

Computer Science.

image

Criminal Justice and Corrections.

image

English Language and Literature, General.

image

Fine and Studio Arts.

Health and Physical Education/Fitness.

image

Mathematics.

image

Music.

image

Psychology, General.

image

Note: Eight “Virginia” colleges are in the overall list (looks like nearly everybody teaches psychology) but VCU, VPI, and Wesleyan are off scale on the graph because of the crowd.

As noted above, these data, especially the medical, dental, and undergraduate numbers, need to be viewed with some skepticism. In any case, they point to an important consideration. For sure, if I had a kid thinking of college, I’d get out the computer and start a discussion about expectations.

2020 Teacher Salaries

It’s Spring and the data in the lower half of the 2020 Superintendent’s Annual Report have sprouted.

Table 19 reports on salaries in some detail. As well, it provides an overview report of division average salaries of “All Instructional Positions” (classroom teachers, guidance counselors, librarians, technology instructors, principals, and assistant principals). Here is a summary of those summary data:

image

The gold bar is Richmond. The red bars are the peer cities, from the left: Hampton, Newport News, and Norfolk. The blue bar is Northumberland, which is just a few dollars above the division average. The green bar is Lynchburg (with a hat tip to James Weigand).

Here is a list of the eight Big Spenders, along with the Richmond peers and two averages. The right-hand column is the difference from the division average.

image

Falls Church leads the pack with an average salary 54.4% above the division average.

The “Teacher Average” is the average salary over all instructional personnel, while the “Division Average” is the average of the division averages. The former number is much larger, primarily because of the higher salaries and larger numbers of personnel in those large NoVa divisions. Notably: Fairfax has 16,363.92 positions (I’d like to meet that 0.92 person) at a $79,554 average.
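The gap between the two averages is the usual weighted-versus-unweighted story; a toy example with two hypothetical divisions makes the point:

```python
# Why the per-person "Teacher Average" exceeds the "Division Average"
# (the mean of division means): big, high-paying divisions pull the
# per-person average up. Positions and salaries here are hypothetical.
divisions = [
    ("Big NoVa division", 16_000, 79_000),   # (name, positions, avg salary)
    ("Small rural division", 300, 48_000),
]

# Unweighted: each division counts once, regardless of size.
division_average = sum(s for _, _, s in divisions) / len(divisions)

# Weighted: each position counts once.
teacher_average = (sum(n * s for _, n, s in divisions)
                   / sum(n for _, n, _ in divisions))

print(division_average, teacher_average)  # 63500.0 vs ~78429
```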

Finally, here are Richmond and the suburbs.

image

Less Crime in Forest Hill (For Now?)

The daffodils are shouting “Spring!” and it’s time to update the Forest Hill crime report data.

In the Police Department database, the neighborhood runs from Forest Hill Park to the boulevard and from Forest Hill Ave. to the river.

As you see, this does not include all of the Forest Hill Neighborhood Ass’n area and does include some of the Westover Hills Neighborhood Ass’n area.  It includes only one side of Forest Hill Ave, notably only one side of the Westover Hills Blvd. commercial area.

Micro$oft has a nice satellite view of the area.

For the period from the start of the Police Department Database, January 1, 2000, through December 31, 2020, that database contains 2268 offense reports for the neighborhood. Among those entries, “theft from motor vehicle” is the most common at 26%.

image

I like to call those incidents “car breakins” but that is not accurate: Most of those are cases where park visitors left the car unlocked. “Abandoned property in car” might be more accurate. The count of “Destruction property/private property” incidents gives a high but approximate measure of the actual breakins. 

image

As usual in a quiet neighborhood, most of the incidents involve property crime.  In the present case, the most frequent violent crime is simple assault, in tenth place behind 68% of the total (ninth place, 64%, if we don’t count the 63 natural deaths).

The neighborhood was enjoying a (fairly) consistent pattern of improvement until 2015.

image

The increases then were largely driven by theft from motor vehicle (and associated entertainments enjoyed by the criminals chummed into our neighborhood by the goodies left in parked cars).

image

By far our worst block for crime is 4200 Riverside Dr.

image

That block is home to the 42d St. Parking Lot (and lots of on-street parking for the Park).

image

Half of the crime reported in the block is theft from motor vehicle, with second place going to property destruction, often the result of a thief breaking in because the car was locked.

image

image

No telling how much of the rest is spillover from the criminals lured into our neighborhood by the cars with valuable property left on the seats.

The earlier decreases in the 4200 block came after Parks’ 2005 response to neighborhood complaints: They started locking the gates to the 42d St. lot at night and off season and they installed rocks to block parking in the part of the lot that is less visible from the street.

image

I attribute the recent increases to the increased use of the Park, the removal of the rocks in 2016, and the reassignment of Stacy, the bicycle cop.

Time will tell whether the nice improvement in 2020 was fruit of the pandemic. For sure, the improvements were not limited to the thefts from cars.

image

Aside from 4400 Forest Hill (mostly the nursing home) and 4700-4800 Forest Hill (the commercial area), the other blocks at the top of the list lead with (for sure, park-related) theft from motor vehicle:

image

image

There are at least two lessons here:

  • Leaving stuff in the car, especially in an unlocked car, is an invitation to lose the stuff and to help chum the neighborhood for criminals; and
  • Given that almost all of the thefts are from the vehicles of park visitors (most of them naïve County residents), and that this crime is entirely preventable, it’s two decades past time for some LARGE signs in the 4200 block and at the Nature Center and 4100 Hillcrest and, especially, in and near the 42d St. parking lot, to warn the visitors:

                      Car Breakins Here! Lock your junk in your trunk.

Somehow, the City manages to post such signs at Maymont but not in Forest Hill.


2019-2020 Attendance

It’s Spring! The Narcissi are standing tall and promising blossoms. The Croci are in flower. Data are sprouting in last year’s Superintendent’s Annual Report.

Table 8, “Number of Days Taught, ADA, ADM,” gives us an early measure of the impact of the pandemic-related shutdowns.

Richmond’s end-of-year count of days taught was 120, just two-thirds of the statutory minimum. Richmond’s total was one day more than those of Hampton and Newport News, three days more than Norfolk, 6.6 days short of the division average.

image

The highest in the state was Buckingham, 139 days; the lowest was Galax, 112.

Table 8 also shows end-of-year attendance data for the elementary and secondary schools. Richmond and all the peer cities managed to exacerbate the impact of the pandemic.

image

The division with the highest average attendance was Falls Church with 96.4%. Lowest was Petersburg, 91.3%.

These data do not give us a picture of the number of days or attendance of any on-line schooling.

Of course, these numbers are but a warmup for the 2020-2021 shutdowns.

It appears that this year’s SOL testing will be voluntary so we’ll have to wait until the summer of 2022 for a measure of the impact of all this.

Inexperienced & Out-Of-Field Teachers

Returning to the Download Data under the School Quality Profiles: Under “Teacher Quality” there is a “Teacher Quality” indicator. The download there provides 2020 state, division, and school percentages of inexperienced teachers, out-of-field teachers, and teachers who are both inexperienced and out-of-field.

We are free to form our own views on whether experience or in- or out-of-field status reflects on teaching quality; VBOE is the licensing agency and it thinks those measures matter.

For Richmond and the state, the data for all schools (i.e., both Title I and Not) look like this.

clip_image001

All the Richmond schools are listed as “High Poverty,” which is consonant with those zeroes for “Low Poverty” but raises the question why the “High Poverty” numbers are different from the “All Schools” values.

Note added on 2/12:

I asked VDOE about this discrepancy and they replied:

For these metrics, high poverty schools and low poverty schools represent the top and bottom quartiles of all schools in the state based on FRPL. So, when looking at the data by division, the high and low poverty school rates are calculated only on a subset of schools in that division. For Richmond specifically, there are 44 high poverty schools, 0 low poverty schools, and 6 schools with no poverty level (FRPL is not calculated).

In any case, the Richmond numbers all are high in comparison to the state averages. That situation persists in both the Title I and Not Title I schools.

clip_image001[6] clip_image001[8]

Turning to the Richmond elementary schools, here are the data, sorted by the inexperienced percentage:

image

Data for Patrick Henry are missing in the database.

The only non-Title I schools here are Munford, Fox, and Holton.

Next, the middle schools.

image

All the middle schools are Title I.

Finally, the high schools.

image

Note: Franklin also has middle school grades. Community, Open, Huguenot, and TJ are Title I.

Richmond Schools and Provisional Teacher Licenses

The Virginia regulation says

The Provisional License is a nonrenewable license valid for a period not to exceed three years issued to an individual who has allowable deficiencies for full licensure as set forth in this chapter.

VDOE says there is a link between teacher training and student learning:

Standards for teachers, administrators and other educators in Virginia’s public schools recognize the link between preparation and content knowledge and student achievement.

To the extent teacher training is linked to student achievement, larger numbers of provisional licensees reflect a problem with school quality because those licensees are teaching without the preparation required for a regular license.

The School Quality Profiles from VDOE have a download page. Under “Teacher Quality,” that page offers data, 2020 only, with counts of provisional licenses statewide, by division, and by school.

The database offers numbers for All Schools, High Poverty, and Low Poverty. The data further come subdivided for Title I, Non-Title I, and All Schools. The Feds tell us that school divisions

target the Title I funds they receive to schools with the highest percentages of children from low-income families. If a Title I school is operating a targeted assistance program, the school provides Title I services to children who are failing, or most at risk of failing

Here, then, are the Richmond and state data, expressed as percentages.

image

The database lists all the Richmond schools as High Poverty, but provides “All Schools” Richmond numbers that are slightly different from the “High Poverty” values. Go Figure.

In any case, the Richmond All Schools/All Schools number, 15.4%, is 2.26 times the state average. As to Non-Title I, Richmond’s percentage is 1.9 times the state average; for Title I (the great majority of Richmond schools), that ratio is 2.2.

Turning to the Richmond elementary schools, we see:

image

The Non-Title I schools here are Munford, Holton, and Fox.

Next, the middle schools.

image

River City is the renamed Elkhardt-Thompson. Indeed, the map on the VDOE Web site still lists the school by the former name.

All the middle schools are Title I.

Finally the high schools

image

The selective schools are colored green. Community, Open, Huguenot, and TJ are Title I.

Note: Franklin also has middle school grades.

To the extent that the Federal count in 2018 measures provisional licensees, it looks like the numbers improved between 2018 and 2020: 22.5% down to 15.4%. Stay tuned while I find out whether RPS will pony up the data by year, including this year, so we can all see the complete picture.

2020 (And Earlier) Dropouts from Richmond Schools

We have seen that Richmond’s public schools have a high 4-year cohort dropout rate among economically disadvantaged (“ED”) students and an appalling rate of dropouts among the more advantaged (“Not ED”) students.

Note: The #N/A entries represent cases where the numbers of students in the particular group is sufficiently small (<10) to trigger VDOE’s suppression rule.

The 2020 4-year cohort report includes data for schools with a graduating class. Here are those data for the Richmond schools, sorted by the school names, along with the state averages.

image

As to the selective high schools, Franklin, Open, and Community, those numbers are what we might wish to see everywhere, zero.

Looking only at the mainstream high schools, here sorted by school names, we see a different situation.

clip_image001

Marshall and, especially, TJ report single digit rates. The dropout rates at the other three schools range from high to stratospheric, and all exhibit the counterintuitive inversion where the more affluent students dropped out at higher rates than the ED students.

Wythe holds last place with a 21% ED rate that is 3.1 times the state average and a 70% Not ED rate, 17 times the state average. The rates at Armstrong and Huguenot are perhaps less appalling but in any case plainly unacceptable.

The 2020 data are bound to be anomalous because of the pandemic so let’s look at some history.

But first a note on the data source: The (usually very helpful) VDOE cohort database access has been removed, but the “Cohort Graduation Build-A-Table” is on the page. Clicking the link brings up a display that seems to offer the cohort dropout rates (along with the graduation rates), but if one has selected the (more honest) Federal graduation rate, the dropout data are absent. Select the (rigged) “On-Time” graduation rate and you also can get dropout data. Go figure.

Here, then, are the recent dropout histories for Richmond and the state:

image

Two things jump out here:

  1. As we would expect, the statewide dropout of ED students is higher than for their more affluent peers; in contrast, the Richmond difference is reversed and is huge; and
  2. The COVID effect (alone or with some other cause) reduced the ED rate statewide by 1.4 points (17% relative) but in Richmond by a whopping 7.6 points (38% relative).
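The arithmetic behind the absolute and relative figures in item 2: the statewide starting rate below is back-computed from those two numbers, so treat it as approximate.

```python
# Absolute (percentage-point) drop vs. relative (percent) drop.
# Starting rate is back-computed from the 1.4-point / 17% figures
# in the text, so it is approximate.
before, after = 8.2, 6.8  # statewide ED dropout rates, in percent

absolute_drop = before - after                 # points
relative_drop = absolute_drop / before * 100   # percent of the starting rate

print(round(absolute_drop, 1), round(relative_drop))  # 1.4 17
```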

The RPS people will want to claim credit for the reduction in the ED rate but that explanation does not comport with the increase in Not ED dropouts. In any case, something weird is going on in Richmond. If you have it figured out, please share that information. The estimable Jim Bacon has one take on that.

Turning to the Richmond schools with graduating classes, we see the wonderful,

image

image

image

. . . and the awful (notice the different scales!),

image

image

image

. . . and the much less disturbing,

image

. . . ranging to right fine (in ‘20, at least).

image

As with the strange ED/Not ED inversions in all the mainstream high schools, these data do not clearly answer the question whether our new Superintendent has done anything useful about the dropout rate.

Counting Teacher Licenses: An Exegesis on Bureaucracy

An earlier post discussed the remarkably large number of unlicensed teachers in Richmond as reported in the 2018 USDoE Civil Rights Data Collection.

An email from the RPS Chief of Staff (added to that earlier post) responded that only four of about 2,100 Richmond teachers now are unlicensed, unless you also count 38 whose paperwork is hanging at VDOE because of COVID-related backups.

If true, that would show an astounding improvement in just three years. Unfortunately, it was not true, at least in the sense of the federal data.

The Feds count (pdf: “crdc-school-form”) as unlicensed all teachers

who did not meet all state licensing/certification requirements. Teachers working toward certification by way of alternative routes, or teachers with an emergency, temporary, or provisional credential are not considered to have met state requirements.

The Virginia regulation provides:

The Provisional License is a nonrenewable license valid for a period not to exceed three years issued to an individual who has allowable deficiencies for full licensure as set forth in this chapter. The Provisional License will be issued for a three-year validity period, with the exceptions of the Provisional (Career Switcher) License that will initially be issued for a one-year validity period and the Provisional Teach For America License issued for a two year validity period. Individuals shall complete all requirements for licensure, including passing all licensure assessments, for a renewable license within the validity period of the Provisional License.

But the Chief of Staff said in an email,

Provisionally licensed teachers count as licensed teachers by the VDOE (that was part of the business rules I had initially inquired on).  So that is not the same as the 4 or the 38 I referenced.

So, of course, the RPS numbers can be vastly different from the CRDC data because of VDOE’s and Richmond’s view of what “licensed” means.

We can get some insight into the difference by looking at the 2020 “School Quality Profile” (the only year available) on the VDOE Web site. There we see numbers of provisionally licensed teachers in RPS and the state:

image

That 2020 state number is 1.9 times the 2018 federal number of unlicensed teachers, while the 2020 Richmond value is 0.68 times as large as that 2018 federal number.

image

To try to make some sense of the two approaches, we might consider the bureaucratic imperatives:

  • Federal: Grow the budget by finding “problems” that can be palliated with federal money. Thus, count all teachers who are not fully and finally licensed.
  • State: Have it both ways – look good (everybody licensed!) but show the need for more money (lots of provisional licenses). Thus, encourage Richmond to count the provisional licenses (in a sense, the learners’ permits) as licenses but also provide provisional license data for the division to use when talking to legislators.

See also the 2020 Annual Report of the Board of Education at pp. 20-21:

Like much of the nation, Virginia continues to face a shortage of quality educators entering and remaining in Virginia’s public schools. This decline is correlated with low teacher salaries and lack of commitment to tap financial resources to correct this crucial situation. Teacher vacancies are found in every region of the Commonwealth, but are not distributed evenly. The number of unfilled positions increased from 440 during the 2010-2011 school year to a height of 1,081 in the 2016-2017 school year, then dropped slightly in the 2017-2018 and 2018-2019 school year. In the 2019-2020 school year, the number went up to 1,063 (Chart II). The percent of provisionally licensed and inexperienced teachers has similarly climbed. This shortage has reached emergency levels in many high poverty school divisions that do not have the resources to compete with other school divisions

Make what you will of this. I think it speaks to the need to provide complete and detailed data to the taxpayers who are funding the education establishment.  For a start, RPS might pony up the complete counts for all classifications of its teachers, licensed vel non, for 2018-2020, so we can see what their situation really is.