2-Year to 4-Year Pipeline Clogged?

Note added 4/11/22: Whoops! Tod Massa confirms that there is a problem with the JMU and Radford data that they are working to correct. As well, there looks to be another problem with my pivot table, so my numbers may not be correct. So, mark this post up as perhaps interesting but certainly wrong. I’ll try again after SCHEV works out the Radford/JMU issues.

Note added 4/12: Corrected a small problem in the first 5 graphs. Even so, in light of the data problems at JMU and Radford, the numbers for those schools and the totals are wrong.

While wandering through the data on the SCHEV Web site, I came upon Table TR01, Trends in Transfer from Two-Year Institutions. The table lets one select a 2-year institution (all but one look to be community colleges), a 4-year institution, a “gender” (they list only men & women; presumably they refer to the biological construct, not the grammatical), a student group (“majority” students or “students of color”), and a sub-cohort of transfer students (ranging from all new transfers to students earning fewer than 6 credits in the first year).

Trouble is, the Web page provides only a limited snapshot, e.g.:

image

Any attempt to break out those data further would require downloading a flock of spreadsheets. However, the ever-helpful Tod Massa of SCHEV kindly sent me a spreadsheet with the entire dataset (whew!). Indeed, his spreadsheet improves on the posted data, reaching back an extra two years to 2011 and forward a year to 2021.

The spreadsheet also includes data for Richard Bland College, although that school is not a Virginia Community College.

Here are the results of my initial prodding of those data.

The count of transfers from all 2-Year colleges to public 4-year programs rose to a peak in 2017 and then declined monotonically, ending down by 14.8% in 2021.

image
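(For those who would rather script than pivot, here is a minimal Python sketch of that peak-to-2021 arithmetic. The file and column names are my inventions; the actual SCHEV spreadsheet layout surely differs.)

```python
import pandas as pd

# Hypothetical file and column names; the real SCHEV spreadsheet layout may differ.
df = pd.read_excel("schev_tr01.xlsx")  # columns: Year, Sending, Receiving, Transfers

totals = df.groupby("Year")["Transfers"].sum()
peak_year = totals.idxmax()            # 2017 in these data
change = (totals.loc[2021] - totals[peak_year]) / totals[peak_year]
print(f"Peak {peak_year}; change to 2021: {change:.1%}")
```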

The largest decreases were at Radford and JMU, with Norfolk State close behind. The largest gains were at VMI and THE University.

image

The data by receiving institution show some interesting patterns (broken out here by relative overall size of the number of transfers). Notice the 2021 plunges at JMU and Radford.

image

image

image

The transfers to all public and non-profit 4-year programs show a 20.9% drop since 2017.

image

The sending 2-Year institutions all show declines since 2017.

image

image

image

The largest decreases, ‘17 to ‘21, were at Lancaster, Wytheville, and Blue Ridge; the smallest, at Camp and Tyler.

image

To the extent that a major function of the 2-Year colleges is a low-cost path to a four-year degree, the demand for that path has been decreasing. These data do not reflect the level of participation in the ordinary 2-year programs or special programs such as FastForward (short-term training courses) and G3 (low-income students).

Dropped Out 2021

It’s Spring! In Richmond, the daffodils are blooming and the 2020-2021 Superintendent’s Annual Report is sprouting data.

The Report has the dropout and Fall enrollment data by division.  Juxtaposing those, we can calculate the division percentages of dropouts.

image

Richmond, with 504 dropouts, 1.85% of the Fall enrollment, is the gold bar.  The red bars are the peer cities, from the left Hampton, Newport News, and Norfolk. The blue bar is the State total. (Oops! An earlier version said 1,091 dropouts, but that is a total of totals. The correct Richmond number is 504; the percentage, 1.85%, is correct.  H/t to a reader of Carol’s repost.)

The yellow bar at 0.86% is Lynchburg, with a hat-tip to a reader (the reader?). The two empty slots at the bottom are Bath and Highland, both with zero dropouts.
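The calculation itself is just a merge and a divide. Here is a minimal Python sketch; the file and column names are my inventions, not VDOE’s.

```python
import pandas as pd

# Hypothetical file and column names; VDOE posts dropouts and fall membership separately.
dropouts = pd.read_csv("dropouts_2020_21.csv")        # columns: Division, Dropouts
enrollment = pd.read_csv("fall_membership_2020.csv")  # columns: Division, Enrollment

merged = dropouts.merge(enrollment, on="Division")
merged["PctDO"] = merged["Dropouts"] / merged["Enrollment"] * 100
print(merged.sort_values("PctDO", ascending=False).to_string(index=False))
```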

Here are the data:

Division Name  % DO

Accomack County  0.72%
Albemarle County  0.40%
Alexandria City  0.65%
Alleghany County  0.68%
Amelia County  0.45%
Amherst County  0.43%
Appomattox County  0.27%
Arlington County  0.38%
Augusta County  0.32%
Bath County  0.00%
Bedford County  0.49%
Bland County  0.61%
Botetourt County  0.28%
Bristol City  0.56%
Brunswick County  1.06%
Buchanan County  0.17%
Buckingham County  0.26%
Buena Vista City  0.91%
Campbell County  0.33%
Caroline County  0.40%
Carroll County  0.21%
Charles City County  0.72%
Charlotte County  0.18%
Charlottesville City  0.84%
Chesapeake City  0.52%
Chesterfield County  0.69%
Clarke County  0.06%
Colonial Beach  0.17%
Colonial Heights City  0.37%
Covington City  0.10%
Craig County  0.56%
Culpeper County  0.99%
Cumberland County  0.69%
Danville City  1.25%
Dickenson County  0.16%
Dinwiddie County  0.43%
Essex County  0.34%
Fairfax County  0.64%
Falls Church City  0.20%
Fauquier County  0.32%
Floyd County  0.11%
Fluvanna County  0.65%
Franklin City  0.92%
Franklin County  0.78%
Frederick County  0.65%
Fredericksburg City  1.44%
Galax City  0.71%
Giles County  0.18%
Gloucester County  0.53%
Goochland County  0.20%
Grayson County  0.07%
Greene County  0.28%
Greensville County  0.54%
Halifax County  0.70%
Hampton City  0.25%
Hanover County  0.28%
Harrisonburg City  1.05%
Henrico County  0.91%
Henry County  0.42%
Highland County  0.00%
Hopewell City  1.48%
Isle of Wight County  0.38%
King George County  0.85%
King William County  0.65%
King and Queen County  0.24%
Lancaster County  0.92%
Lee County  0.65%
Lexington City  1.08%
Loudoun County  0.19%
Louisa County  0.54%
Lunenburg County  0.39%
Lynchburg City  0.86%
Madison County  0.06%
Manassas City  0.84%
Manassas Park City  1.20%
Martinsville City  0.55%
Mathews County  1.09%
Mecklenburg County  0.74%
Middlesex County  0.09%
Montgomery County  0.52%
Nelson County  0.44%
New Kent County  0.07%
Newport News City  0.61%
Norfolk City  1.11%
Northampton County  0.81%
Northumberland County  0.17%
Norton City  0.48%
Nottoway County  0.95%
Orange County  0.19%
Page County  0.13%
Patrick County  0.42%
Petersburg City  0.85%
Pittsylvania County  0.43%
Poquoson City  0.34%
Portsmouth City  0.79%
Powhatan County  0.12%
Prince Edward County  1.49%
Prince George County  0.47%
Prince William County  0.62%
Pulaski County  0.31%
Radford City  0.12%
Rappahannock County  0.68%
Richmond City  1.85%
Richmond County  0.24%
Roanoke City  1.10%
Roanoke County  0.38%
Rockbridge County  1.02%
Rockingham County  0.40%
Russell County  0.62%
Salem City  0.27%
Scott County  0.39%
Shenandoah County  0.32%
Smyth County  0.15%
Southampton County  0.68%
Spotsylvania County  0.77%
Stafford County  0.45%
Staunton City  0.99%
Suffolk City  0.60%
Surry County  0.62%
Sussex County  0.10%
Tazewell County  0.36%
Virginia Beach City  0.32%
Warren County  0.34%
Washington County  0.41%
Waynesboro City  0.43%
West Point  0.38%
Westmoreland County  0.20%
Williamsburg-James City County  0.44%
Winchester City  1.13%
Wise County  0.11%
Wythe County  0.21%
York County  0.19%
State Totals 0.58%

Crime Report for Forest Hill Neighborhood

The 2021 data in the RPD database have had a chance to settle down and it’s time to update the Forest Hill crime report data.

The City defines the neighborhood to run from Forest Hill Park to the Boulevard and from Forest Hill Ave. to the river.

As you see, this does not include all of the Forest Hill Neighborhood Ass’n area and does include some of the Westover Hills Neighborhood Ass’n area.  It includes only one side of Forest Hill Ave, i.e., only one side of the Westover Hills Blvd. commercial area.

Micro$oft has a nice satellite view of the area.

For the period from the start of the database, January 1, 2000, through December 31, 2021, it contains 2334 offense reports for the neighborhood. Among those entries, “theft from motor vehicle” is the most common at 26%.

image

I like to call those incidents “car breakins” but that is not accurate. The 51 “destruction property/private property” reports that share an incident number with a theft from MV suggest that only about 8.3% of the thefts were breakins. Most were cases where the owner left the car unlocked. “Abandoned property in vehicle” might be more accurate.
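For the curious, that matching is a one-liner once the data are loaded. A sketch, with hypothetical column and offense labels (the actual RPD extract differs):

```python
import pandas as pd

# Hypothetical column and offense names for the RPD extract.
rpd = pd.read_csv("rpd_forest_hill.csv")   # columns: IncidentNumber, Offense

thefts = rpd[rpd["Offense"] == "Theft From Motor Vehicle"]
destruction = rpd[rpd["Offense"].str.contains("Destruction", case=False)]

# A theft whose incident number also appears on a destruction report
# looks like a genuine breakin; the rest were presumably unlocked cars.
breakins = thefts["IncidentNumber"].isin(destruction["IncidentNumber"]).sum()
print(f"{breakins} of {len(thefts)} thefts ({breakins / len(thefts):.1%}) look like breakins")
```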

As usual in a quiet neighborhood, most of the incidents involve property crime.  In the present case, the most frequent violent crime is simple assault, in tenth place, behind categories that account for 68% of the total (ninth place, 66%, if we don’t count the 64 natural deaths).

The neighborhood was enjoying a (fairly) consistent pattern of improvement until 2015.

image

The increases then were largely driven by theft from motor vehicle (and associated entertainments enjoyed by the criminals chummed into our neighborhood by the goodies left in parked cars).

More recently, things have been back on a path to improvement.

image

By far our worst block in terms of total offense reports is 4200 Riverside Dr.

image

Note: In the Distant Past, RPD reported offenses by street address, except for family issues such as incest. They now report only by the block, e.g., “42XX Riverside Dr.” The Freedom of Information Act gives them room to be about as unhelpful as they like and they clearly think being unhelpful in this respect is more important than being transparent.

The 4200 block is home to the 42d St. Parking Lot (and lots of on-street parking for the Park).

image

Forty-nine percent of the crime reported in that block is theft from motor vehicle, with second place going to property destruction. The data suggest that 22 of the 59 “destruction” incidents were car breakins, leaving about 157 thefts where the car was unlocked.

image

No telling how much of the rest is spillover from the criminals lured into our neighborhood by the cars with valuable property left on the seats.

The earlier decreases in the 4200 block came after Parks’ 2005 response to neighborhood complaints: They started locking the gates to the 42d. St. lot at night and off season and they installed rocks to block parking in the part of the lot that is less visible from the street.

image

I attribute the recent increases to the increased use of the Park, the removal of the rocks in 2016, and the reassignment of Stacy, the bicycle cop.

Time will tell whether the recent improvement is fruit of the pandemic. For sure, the improvements mirror those in the thefts from vehicles.

image

Aside from 4400 Forest Hill (probably driven by the nursing home) and 4700-4800 Forest Hill (the commercial area), the other blocks at the top of the list lead with (for sure, Park-related) theft from motor vehicle.

image

image

There are at least two lessons here:

  • Leaving stuff in the car, especially in an unlocked car, is an invitation to lose the stuff and to help chum the neighborhood for criminals; and
  • Given that almost all of the thefts are from the vehicles of park visitors (most of them naïve County residents), and that this crime is entirely preventable, it’s two decades past time for some LARGE signs in the 4200 block and at the Nature Center and 4100 Hillcrest and, especially, in and near the 42d St. parking lot, to warn the visitors.

Somehow, the City manages to post such signs at Maymont but not in Forest Hill.

image

Another side to the story: A contact at RPD whose opinions I value suggests that signs don’t do much good. I still think they are worth a try: They are inexpensive, especially when compared to the costs of, e.g., increased policing.

Note added on January 30: We took a walk down to the 42d St. lot on this chilly Sunday and spotted this:

image

The location is not good: It doesn’t command the entrance to the path. But it’s a small step in the right direction.


Harvesting Us Oldies

VPAP has some interesting COVID data, apparently derived from the VDH data.

A first glance at the chart by age for the last fourteen days (through 1/11) emphasizes that the great majority of deaths were among people at or above age 60.

image

Recasting those data reemphasizes that first impression.

image

In short, 41.9% of COVID deaths in that two-week period were people 80 or above; 85.7% were 60 or above.

The estimable Jim Bacon points out that it might be useful to view the case, etc. data as percentages of the age group populations. The Health Dept. and Census have those numbers (here, COVID totals through 1/12, not just the last two weeks).
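The recast is simple division. A minimal sketch, with invented file and column names standing in for the VDH and Census tables:

```python
import pandas as pd

# Hypothetical inputs: VDH counts and Census populations by age bracket.
covid = pd.read_csv("vdh_by_age.csv")    # columns: AgeGroup, Cases, Hospitalizations, Deaths
pop = pd.read_csv("census_by_age.csv")   # columns: AgeGroup, Population

df = covid.merge(pop, on="AgeGroup")
for col in ["Cases", "Hospitalizations", "Deaths"]:
    df[f"{col}Per100k"] = df[col] / df["Population"] * 100_000
print(df[["AgeGroup", "CasesPer100k", "HospitalizationsPer100k", "DeathsPer100k"]])
```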

image

Yep. It’s those youngsters (many of them unvaccinated) who are catching the disease. But the little wretches are not suffering the consequences at anything like the rate of us older folks.

Let’s take the case numbers off the graph so we can see the hospitalization and death data in more detail.

image

Indeed. Cast the data as you will, it’s clear that being old, er, mature is dangerous.

Counting Parents

The Annie E. Casey Foundation’s Kids Count Web site includes Virginia household single-parent percentages by race and by Hispanic culture or origin. Juxtaposing those data for calendar 2010 to 2018 with the SOL pass rates for 2010-2011 through 2018-2019 produces some interesting graphs.

Note: The VDOE database offers data for students who are, or are not, “economically disadvantaged” (here abbreviated ED and Not ED). The criteria are largely driven by eligibility for free or reduced-price lunches. The ED/Not ED distinction is interesting in the SOL context because ED students generally underperform their more affluent peers by some fifteen to twenty points, depending on the subject. So let’s look at the data for both ED and Not ED students. The ideal dataset would provide single-parent counts for both economic groups. The Kids Count data are totals for each race or culture, so we’ll make do with that limitation.
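For anyone who wants to reproduce the juxtaposition, here is a minimal sketch. The merged file and its columns are assumptions, not the actual Kids Count or VDOE layouts.

```python
import matplotlib.pyplot as plt
import pandas as pd

# Hypothetical merged dataset: one row per group per year, built from
# the Kids Count percentages and the VDOE pass rates.
df = pd.read_csv("passrate_vs_single_parent.csv")
# columns: Group, Year, PassRate, PctSingleParent

fig, ax = plt.subplots()
for group, g in df.groupby("Group"):
    ax.scatter(g["PctSingleParent"], g["PassRate"], label=group)
ax.set_xlabel("% single-parent households")
ax.set_ylabel("Reading pass rate, Not ED (%)")
ax.legend()
plt.show()
```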

To start, here are the reading pass rates over the nine-year period for Not ED Asian, Black, Hispanic, and white students v. % single-parent homes for ED + Not ED members of each group.

image

Of course, correlation does not establish causation; we cannot control for all the other possible variables. Here, whatever the underlying cause(s) may be, the performance differences of the four groups are clear and clearly related to the percentages of single-parent households.

For all four groups, the outlying pair of higher pass rates is from AY 2011 and 2012, the two years in these data prior to the new, tougher reading tests.

The data for the ED students show a similar pattern but with lower pass rates and larger decreases after the deployment of the newer tests.

image

The math data paint similar pictures, albeit the new math tests came a year earlier so there is, at most, only a single high outlier in each group.

image

image

We can get another view of the relationship between pass rates and single-parent homes by looking at the averages over the 9-year span.

image

Again, these data cannot tell us whether the lack of a second parent causes these effects on performance.  But they do imply a strong relationship between the number of parents in the home and SOL performance.

The ED/Not ED differences in these averages range from 9.2 to 18.1 points.

image

Per subject, the differences in the averages appear to increase slightly with increasing percentages of single-parent homes.

image

Where Have All The Students Gone?

The estimable Jim Bacon recently posted on declining enrollments in many public schools. He used VDOE data comparing the fall division enrollments (aka division “memberships”) in 2021 with those in 2019. Those data showed the largest drops in “rural, non-metropolitan” areas.

Of course, most Virginia school divisions are in such areas.

Highland County, with the smallest enrollment in Virginia, provides an extreme example. Here are their enrollments, back to the fall of 2011, compared to the state totals. Enrollments are presented as percentages of 2019 in order to fit the disparate enrollments onto the same graph.

image

Notes: The spreadsheet Jim relied upon reports the data by school year, e.g., 2019-2020. These being fall enrollments, I just use the fall dates, e.g., 2019. The (least squares) fitted lines in the graphs here use the data from 2011 through 2019 and are extrapolated to 2021.

The state data showed a 3.6% decrease from 2019 to 2021.  Before 2020 the state enrollment showed a consistent increase; if we extrapolate that trend to 2021, the difference between the extrapolated value and the 2021 number is –4.4% relative to the extrapolated number.
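Here, for the record, is a sketch of that fit-and-extrapolate arithmetic. The enrollment numbers are illustrative, shaped like the state series but not the real data.

```python
import numpy as np

# Illustrative numbers with the same shape as the state data, not the real series.
years = np.arange(2011, 2022)
pct_of_2019 = np.array([96.2, 96.8, 97.3, 97.9, 98.4, 98.9, 99.3, 99.7, 100.0, 97.0, 96.4])

# Least-squares fit on 2011-2019 only, extrapolated to 2021.
slope, intercept = np.polyfit(years[:9], pct_of_2019[:9], 1)
extrapolated = slope * 2021 + intercept
actual = pct_of_2019[-1]
print(f"2021 actual vs. extrapolated: {(actual - extrapolated) / extrapolated:.1%}")
```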

The scatter in the Highland data is to be expected, of course; such small enrollments (178 in 2021) will vary from year to year. The 2021 number is –18.3% v. 2019 and –16.0% v. the extrapolated value.

The R-squared value on the Highland County pre-2020 data, 2.4%, tells us to disdain any correlation. Even so, the pattern of the 2020 and 2021 numbers suggests a persistent decrease, but don’t bet any real money on how large it might actually be.

The comparison of 2021 numbers to the extrapolated enrollments jumps Richmond to the top of the list (17.6% decrease), thanks to its increasing enrollments pre-2020.

image

The 2020 datum, however, suggests something unusual is going on in Richmond. The modest scatter before ‘20 lends some credence to the extrapolation but the huge difference between the ‘20 and ‘21 data counsels caution about any conclusions as to 2021.

With a dose of skepticism, then, here are the nineteen largest decreases:

image

The highlighted divisions are on Jim’s list of the 19 biggest losers.

Notice that the preponderance of “rural, non-metropolitan” divisions has disappeared. Indeed, Alexandria, Arlington, and Loudoun have joined the list, along with Falls Church, Manassas, and Manassas Park (and Fairfax is just four below Franklin County).  Charles City, Craig, Surry, Sussex, Buchanan, Cumberland, Buckingham, Greensville, Nottoway, Lee, and Franklin City from Jim’s list are absent here.

Alexandria’s state-sized enrollment decrease after 2019 is much more interesting in the context of the consistently rapid increases before 2020.

image

The decrease from 2019 to 2021 is 3.8%, but the decrease from the extrapolated value is a much more dramatic 9.9%.

Arlington shows a similar pattern.

image

As does Loudoun.

image

Although just off the list, our largest division, also a part of the NoVa complex, shows a milder version of the same pattern.

image

Alexandria, Arlington, Loudoun, and Fairfax enrolled 24% of the state total in 2021. All four roughly mirrored the state decreases in ‘20 and ‘21; measured against their pre-2020 increases, however, their decreases were much larger.

BTW, Falls Church, although much smaller, shows a similar pattern.

image

And Prince William, with a 5.3% decrease, shares the pattern.

image

Clearly, something changed in those suburban DC divisions, reversing histories of increases. The similar patterns suggest similar forces working to increase and then reduce the enrollments. The increases probably were driven by population growth. These data do not confirm that, of course. To the point here, they do not tell us what might have caused the more recent decreases.

Turning to the other end of the list, here are the big gainers from Jim’s post.

As Jim noted, these are mostly “rural, non-metropolitan” areas.

And here are the extrapolated top sixteen.

image

The highlighted divisions are in Jim’s top sixteen. Except for Hampton, the divisions here are, again, mostly “rural, non-metropolitan” areas.

Radford has seen a huge relative increase since 2019.

image

The actual numbers are a 2019 membership of 1,642 vs. 2,701 in 2021, an increase of about 64%. Small numbers, large relative change.

Giles is similar.

image

Essex reversed a declining enrollment.

image

The larger, urban division in this list is Hampton (nearly 20,000 students), where the 2021 increase runs counter to a pattern of decreases.

image

None of these patterns looks like those of the NoVa divisions.

One possible factor in the many enrollment decreases is increases in home schooling and exercises of the religious exemption. Matt Hurt has posted data showing increases in home schooling and/or religious exemption counts from 2020 to 2021. Most of those were in the lower grades.

The VDOE data show a state-average 40% increase in the Home+Religious total from 2019 to 2021. Comparing the division totals with the calculated 2021 enrollments gives a measure of the potential H+R effect on the calculated enrollment change.

image

Statewide, the H+R change was 30.9% of the enrollment change. In Highland, the H+R change was 67.7%. In the NoVa jurisdictions, the change ranged from 13.4% in Loudoun to 2.2% in Arlington. Prince William was –24.2%. We can infer that increases in home schooling and religious exemptions were major factors in some enrollment decreases but, with the possible exception of Prince William, not in those large decreases in NoVa.
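A sketch of that comparison, again with invented file and column names:

```python
import pandas as pd

# Hypothetical columns; H+R = home-schooled plus religious-exemption counts.
df = pd.read_csv("division_enrollment_hr.csv")
# columns: Division, Enroll2019, Enroll2021, HR2019, HR2021

# What fraction of each division's enrollment drop could the H+R growth explain?
# Positive where enrollment fell and H+R grew; the sign flips where enrollment grew.
df["HRShare"] = (df["HR2021"] - df["HR2019"]) / (df["Enroll2019"] - df["Enroll2021"]) * 100
print(df[["Division", "HRShare"]].sort_values("HRShare", ascending=False).to_string(index=False))
```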

We don’t have data to tease out the effects of increases in private schooling, departures from the jurisdiction, or other potential causes of the post-2019 enrollment decreases.

CIP Lights the Way

Amidst the abiding failures of the Board of “Education” to repair the (mostly urban) schools that are damaging our children (the glaring example is the Board’s sixteen-plus-year misfeasance as to the disaster in Petersburg), some systems have improved despite the Board’s incompetence.

Beginning in 2016, the school divisions in Southwest Virginia organized a consortium that applied four basic notions:

  • Identify the most successful teachers;
  • Share their instructional materials and best practices;
  • Set high expectations; and
  • Measure what works.

Please notice that “Spend more money” is not on the list.

They call their effort the “Comprehensive Instructional Program,” aka the CIP. The Director of the CIP, Matt Hurt, recently authored six blog posts (I, II, III, IV, V, and VI) that explain the CIP and, in the process, illuminate some of the counterproductive actions of the State Board.

See this post for a nice summary of the CIP.

Most remarkably, the CIP has succeeded in its effort to reach those students who most need good teaching and who are hardest to teach, the less affluent.

The SOL data show that economically disadvantaged (“ED”) students (basically those eligible for free/reduced-price lunches), as a group, underperform their more affluent peers (“Not ED”) by about 20 points, depending on the subject. Those same data show the effect of the CIP in the founding divisions, those in Region 7 (far southwest Virginia).

Let’s start with the reading pass rate average of the Region 7 divisions and the state average pass rate, by year, for both ED and Not ED students.

image

Notice the consistent improvement of the ED rate versus the state, culminating in a 12.1 point difference in 2021.

image

The math data show an even more dramatic ED improvement resulting in a 15.2 point difference in 2021.

image

image

Compare those results to the Petersburg debacle:

image

image

image

image

I think our new Governor should, on his first day, fire the members of the Board of “Education,” replace them with people recommended by Matt Hurt, and appoint Hurt as Secretary of Education.

WSJ College Rankings

The Wednesday edition of the Wall Street Journal contains, behind the paywall, their annual rankings of “nearly 800 U.S. colleges and universities.”

The Top Ten are Harvard, Stanford, MIT, Yale, Duke, Brown, CalTech, and Princeton, with Johns Hopkins and Northwestern tied for tenth.

The top Virginia schools are:

image

The WSJ explains their methodology:

Rankings are based on 15 key indicators that assess colleges in four areas: Outcomes, Resources, Engagement and Environment. Outcomes accounts for 40% of the weighting and measures things like the salary graduates earn and the debt burden they accrue. Resources, with a 30% weighting, is mainly a proxy for the spending schools put into instruction and student services. Engagement, drawn mostly from a student survey and with a 20% weight, examines views on things like teaching and interactions with faculty and other students. Environment, at 10%, assesses the diversity of the university community.
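In other words, the composite is a weighted sum. A toy example (the component scores are invented; the WSJ publishes no such worksheet):

```python
# A toy recomputation of the WSJ composite from the stated weights.
# The component scores here are invented for illustration.
weights = {"Outcomes": 0.40, "Resources": 0.30, "Engagement": 0.20, "Environment": 0.10}
scores = {"Outcomes": 85.0, "Resources": 70.0, "Engagement": 90.0, "Environment": 60.0}

composite = sum(weights[k] * scores[k] for k in weights)
print(f"Composite: {composite:.1f}")  # 79.0 here; higher composite -> better rank
```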

The Web version of the article has a helpful tool that lets us look into the rankings of Virginia’s schools.

image

They don’t list Virginia State.

One major reason for attending college, of course, is that college graduates make more money. (“Most 4-year schools deliver a 15 percent return on investment – double that of the stock market.”)

Plotting the ranking v. outcome data for the thirteen Virginia schools ranked above 400, we see:

image

The three “400” outcomes there are “>400” in the data, i.e., all three would be somewhere off the left side of the graph if the rankings broke out the >400 group.

The dashed line denotes ranking = outcome. Schools above that line received rankings better than their outcomes; schools below, rankings worse.  For example, THE University is in the higher-outcome group, with a 30th-place outcome but a 55th-place ranking.

JMU is the outlier here with an outcome of 106 but a ranking of only 286. No telling how far HSC, VCU, and EMU lie from the line.

Mary Washington is interesting but does not show on the graph: Ranking is somewhere in the 500’s but outcome = 325.

Turning to ranking v. cost, we see, again for the top thirteen:

image

Finally, outcome v. cost (again with the three >400 outcome entries plotted at 400 but actually ranked lower, perhaps much lower, than 400).

image

2021 SOL Results, First Cut

2020 was the first spring since 1998 without SOL tests in Virginia. Then came 2021, when participation in the testing was voluntary.

The VDOE press release says, “[2020-21] was not a normal school year for students and teachers, in Virginia or elsewhere, so making comparisons with prior years would be inappropriate.” The first line of the very next paragraph of the press release then quotes the Superintendent making a comparison: “Virginia’s 2020-2021 SOL test scores tell us what we already knew—students need to be in the classroom without disruption to learn effectively.”

Let’s look at some data and see whether they offer any principled implications.

But first: As we have seen, economically disadvantaged students (“ED”) underperform their more affluent peers (“Not ED”) by around twenty points, depending on the test. That gap renders the school, division, and state averages meaningless, because those averages move with the varying percentages of ED students. Fortunately, the VDOE database offers data for both groups. Hence the more complicated analyses below.
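A toy example shows why the unadjusted averages mislead:

```python
# Two hypothetical schools with identical ED and Not ED pass rates;
# only the ED share differs, yet the overall averages diverge by 10 points.
ed_rate, not_ed_rate = 60.0, 80.0

for name, ed_share in [("School A", 0.20), ("School B", 0.70)]:
    overall = ed_share * ed_rate + (1 - ed_share) * not_ed_rate
    print(f"{name}: overall pass rate {overall:.0f}%")
# School A: 76%; School B: 66% -- same group rates, different demographics.
```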

To start, let’s look at the numbers of students tested by year for reading in Richmond and statewide.

image

As we might expect, the ED tested counts dropped in 2021 even more precipitously than the Not ED. In Richmond, the Not ED decrease was about double the state average, the ED, nearly three-fold.

image

The pass rates (of those fewer students) fell, relative to 2019, but not as far as we might have feared.

image

The Richmond ED score drop was nearly twice the state average; the Not ED, a bit short of 1.5 times.

image

Note, however, that even a 6.7 point drop in the pass rate is huge.

We can only speculate about the effects of the various factors that might lie beneath these numbers. Those factors might include:

  • Quality of the online instruction,
  • The students’ capability to learn online,
  • Students’ efforts in the presence of an online teacher,
  • Parental desire to see results for their children,
  • Parental opposition to SOL testing,
  • Parental concern for COVID exposure during the testing, and
  • Relaxed graduation requirements.

The ED/Not ED difference increased, both in Richmond and Virginia.

image

The math counts show decreases similar to the reading.

image

image

Those who took the math tests, both ED and Not, did not do well.

image

image

And, again, the Not ED/ED gap of those tested in ‘21 increased.

image

The H&SS test counts were so low that the pass rates must be close relatives of meaningless.

image

image

Hmmm. Even so, the Richmond Not ED pass rate edged above the state average.

The science counts were close to the math numbers.

image

image

The writing data painted yet another messy picture.

image

image

Note the increase in the Richmond ED rate. Looks like that 13.3% were a special group.

There is one clear inference available from these data: The test results may well have been helpful to individual students but making the SOL testing voluntary made the collected results meaningless. Those collected results do not even give any measure of “what we already knew,” although they are not inconsistent with the notion that the online instruction did not work well. We’ll have to wait for the 2022 results to get a clearer idea of the COVID effects.

That’s enough for one post. It will take some time to mine the information in this year’s data; if you’re interested, please stay tuned.

College Education: Debt and Income

The Wall Street Journal  has a piece (behind a paywall) comparing student debt and subsequent incomes for a number of college programs. A calculator near the bottom of the page allows the reader to select the degree and then any college(s) to emphasize on a graph. The data are ratios of median debt for graduates in “roughly” 2014-15 to median income two years later.

Caveat: Data are only for students with federal student loans and may not be representative of results for students who did not take out such loans. As well, the data are medians and do not capture the experiences of students who borrowed or earned much more or less.

To start, here is the graph for medical schools, with data for seven selected institutions:

image

Unfortunately for clarity here, the Journal sorts by school name, so we have to look at colors and ratios to find the selected schools on the graph.

The graph shows the ratio of debt to income at each school; the table below the graph shows those ratios as well as the median debt and income for the selected schools. The vertical axis merely expands to allow points for multiple schools in a given debt/income range; position in that direction is irrelevant.

I selected Harvard, Stanford, and Columbia here because of their position at the low ratio end of the list. Otherwise, my selections here and below have “Virginia” in the school name or were spotted by my scan of the list.

We might expect graduates of prestigious schools such as Harvard, Stanford, and Columbia to enjoy relatively larger incomes; that is the case here but the low ratios also reflect lower debt levels.

UVa is the best of the Virginia batch, with a 2.19(!) debt-to-income ratio.

With a little help from Professor Bill Gates, we can recast the data as a plot of income v. debt.
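For those without Excel handy, a minimal matplotlib sketch does the same recast (file and column names invented):

```python
import matplotlib.pyplot as plt
import pandas as pd

# Hypothetical extract of the WSJ medians; the real data sit behind the paywall.
df = pd.read_csv("med_school_debt_income.csv")  # columns: School, MedianDebt, MedianIncome

fig, ax = plt.subplots()
ax.scatter(df["MedianDebt"], df["MedianIncome"])
for _, row in df.iterrows():
    ax.annotate(row["School"], (row["MedianDebt"], row["MedianIncome"]), fontsize=7)
ax.set_xlabel("Median debt ($)")
ax.set_ylabel("Median income two years after graduation ($)")
plt.show()
```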

image

The upper left corner there is where you’d want your kid.

The two outliers here, University of Puerto Rico-Medical Sciences and Southwest College of Naturopathic Medicine & Health Sciences, compress the rest of the graph. Expanding the income axis to remove those two gives a more detailed picture.

image

Here the relatively higher incomes and lower debt of Harvard et al. stand out. Likewise the more modest and roughly similar incomes and higher to much higher debt levels at the Virginia schools other than UVa.

We have to wonder whether these modest (for doctors) incomes reflect the going rates for interns and not for doctors out in practice. The WSJ  article does not contain the word “intern.”

Turning to law schools we see:

image

Here, THE University is playing with the Big Guys, with hefty incomes offsetting the relatively high debt level.

image

Perhaps interestingly, the bulk of the law graduates here see incomes in the vicinity of $60,000, whatever their debt levels.

In contrast to the case of medical schools, most law graduates go to jobs, not internships.

In any event, these and the med school data might serve to raise questions about Willie and Waylon’s advice to mothers to “let [your babies] be doctors and lawyers and such.”

Likewise dentistry (but note that VCU beats out Harvard on both the ratio and the incomes(!)).

image

image

Some (many?) dentistry graduates take multi-year residencies so, as with the doctors, these data may paint a distorted picture.

Turning to bachelor’s degrees, those in “Business/Commerce General” paint a much prettier picture.

image

We probably should wonder about these incomes: Many of the better students in bachelor’s programs go on to graduate school and won’t show in the WSJ  data.

In any event, Chemical Engineering.

image

And Chemistry.

image

Computer Science.

image

Criminal Justice and Corrections.

image

English Language and Literature, General

image

Fine and Studio Arts.

Health and Physical Education/Fitness

image

Mathematics:

image

Music:

image

Psychology, General.

image

Note: Eight “Virginia” colleges are in the overall list (looks like nearly everybody teaches psychology) but VCU, VPI, and Wesleyan are off scale on the graph because of the crowd.

As noted above, these data, especially the medical, dental, and undergraduate numbers, need to be viewed with some skepticism. In any case, they point to an important consideration. For sure, if I had a kid thinking of college, I’d get out the computer and start a discussion about expectations.