Graduation and Not

VDOE posted the 2015 4-year cohort graduation data yesterday.  Their press release burbled on about the increase in the On-Time rate to over 90%. 

As we shall see, the On-Time rate is a fiction, created by VDOE to inflate the rate.  But first, some background.

  • The Standard Diploma requires twenty-two “standard credits” and six “verified credits” in specified subjects.  
  • The Advanced Studies Diploma requires twenty-four standard and nine verified credits. 

These are the only diplomas recognized by the Feds for calculation of the federal graduation indicator.  VDOE counts three further diplomas toward its inflated “On-Time” graduation rate:

  • The Modified Standard Diploma is available to students “who have a disability and are unlikely to meet the credit requirements for a Standard Diploma.”  This diploma is being phased out in favor of “credit accommodations” that will allow students who would have pursued a Modified Standard Diploma to earn a Standard Diploma.  Those of us who have watched the wholesale institutional cheating via the VGLA may be forgiven for thinking that these accommodations will be a fertile field for schools and divisions to game the system.
  • The Special Diploma, now known by the new euphemism “Applied Studies Diploma,” “is available to students with disabilities who complete the requirements of their Individualized Education Program (IEP) and who do not meet the requirements for other diplomas.”
  • The General Achievement Diploma “is intended for individuals who are at least 18 years of age and not enrolled in public school or not otherwise meeting the compulsory school attendance requirements set forth in the Code of Virginia.”

I have commented elsewhere on Richmond’s abuse of the process for identifying and testing kids with disabilities.

This year, the 4-year cohort On-Time rate for Virginia was 90.5%.  The federal graduation indicator, known here as the “actual” graduation rate, was 86.7%. 

In some divisions, notably Richmond, the difference was larger than the statewide 3.8 points:

[image: On-Time v. actual graduation rates, by division]

Richmond’s actual rate is lowered because of the extraordinary numbers of Modified Standard and Special diplomas.

[image: Richmond diploma counts by type]

Note also the relative paucity of Advanced diplomas in Richmond.

As we see here, both the actual and bogus Richmond rates were considerably lower than the corresponding state rates.  This continues an established trend:

[image: Richmond v. state graduation rates, by year]

Something like half of the students who do not receive a 4-year diploma of any sort drop out.  (Others hang on and drop out later or graduate in 5 or 6 years).

[image: cohort outcomes: diplomas and dropouts]

It does not take much imagination to conclude that some of the Richmond cohort’s 167 dropouts might turn into people one would prefer to not meet in a dark alley.

The VDOE data also allow a (limited, as we shall see) look at the graduation and dropout rates per school. 

[image: graduation rates by Richmond high school]

None of the mainstream high schools has an actual graduation rate as high as the state average, although TJ managed to beat the state On-Time rate.

There are no data for Maggie Walker here because it is not a Richmond Public School.  There are no data for Walker elsewhere because VDOE cooks the books, as with the SOL scores, by assigning the Walker students’ data to the public high schools in those students’ home districts.  Thus, the true graduation rates of the mainstream high schools, both actual and on-time, are even lower than reported.

Finally, here are the cohort dropout rates by school.

[image: cohort dropout rates by Richmond high school]

Preliminary Graduation Data

VDOE is out today with a press release bragging on the increased On-Time graduation rate.

The VDOE Web site was down today until suppertime; I will not have time to analyze the data until tomorrow.  Until then, here are some early data (“Actual” rate refers to the advanced+standard diploma rate; “on-time” includes counts of those diplomas plus the modified standard, special, and general achievement diplomas):

[images: preliminary 2015 actual and On-Time graduation rates]
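
In other words, the only difference between the two rates is which diplomas count in the numerator.  A minimal sketch, using hypothetical diploma counts (substitute a division’s actual counts from the VDOE tables):

```python
# Hypothetical counts for a 4-year cohort; not VDOE data.
cohort = 1000

advanced = 450            # Advanced Studies diplomas
standard = 400            # Standard diplomas
modified_standard = 30    # Modified Standard diplomas
special = 25              # Special (Applied Studies) diplomas
general_achievement = 5   # General Achievement diplomas

# Federal indicator ("actual" rate): Advanced + Standard only.
actual = 100 * (advanced + standard) / cohort

# VDOE "On-Time" rate: adds the three diplomas the Feds do not count.
on_time = 100 * (advanced + standard + modified_standard
                 + special + general_achievement) / cohort

print(f"Actual:  {actual:.1f}%")    # 85.0%
print(f"On-Time: {on_time:.1f}%")   # 91.0%
```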

If at First You Don’t Succeed, Try Something Easier

VDOE adopted the SGP (Student Growth Percentile) in 2011 to have a performance measure that is not dependent on the economic status of the student.  The SGP turned out to be too useful: It allowed the identification of the truly lousy teachers, e.g., the statewide 2014 distributions by teacher

[image: statewide 2014 SGP distributions by teacher]

juxtaposed with the 6th grade reading distribution for Richmond teachers

[image: 2014 grade 6 reading SGP distribution for Richmond teachers]

Notice: twenty of twenty-one below the state average.  Beyond that, how would you like to have your kid in the hands of that teacher whose students performed in the 13th percentile?
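
For readers new to the measure: an SGP ranks a student’s progress against students statewide who started from similar prior scores, so differences in starting points are largely factored out.  Here is a toy sketch of the idea (real SGPs use quantile regression over the full testing history; the scores below are invented):

```python
from bisect import bisect_left

def growth_percentile(prior, current, peers):
    """Toy SGP: rank `current` among the current-year scores of peers
    who had the same prior-year score.  Real SGPs use quantile
    regression; this is only an illustration."""
    same_start = sorted(c for p, c in peers if p == prior)
    below = bisect_left(same_start, current)  # peers who scored lower
    return 100 * below / len(same_start)

# Invented scores: (prior-year, current-year) pairs.
peers = [(400, 410), (400, 430), (400, 450), (400, 470), (400, 490)]
print(growth_percentile(400, 450, peers))  # 40.0 -- a 13th percentile class is far worse
```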

So VDOE is abandoning the SGP for a new measure, the “Progress Table.”  Their “reason”:

SGPs must be calculated each year, and the calculations cannot be prepared until all statewide data are available. This requirement has resulted in growth information not being available to school districts until the early fall of the next school year.

But VDOE designed the SGP process knowing that the summer testing would delay the results until Fall.  On this evidence we must conclude that either

  1. They are stupid, or
  2. They are lying.

And we know they are not stupid.

We can speculate about the reasons for this mendacious change of course.  My own guess is political pressure from the teachers.

For sure, VDOE is under other pressure for having illuminated poor performance.  Following the precipitous drop in SOL scores with the new math test in 2012 and the new reading test in 2013,

[image: statewide SOL pass rates by year, showing the 2012 math and 2013 reading drops]

the accreditation numbers plunged, despite VDOE’s elaborate manipulation of the data.

This year the General Assembly responded by passing identical bills from both the House and Senate:

The Board of Education shall promulgate regulations establishing additional accreditation ratings that recognize the progress of schools that do not meet accreditation benchmarks but have significantly improved their pass rates, are within specified ranges of benchmarks, or have demonstrated significant growth for the majority of their students. The Board shall implement such regulations no later than the 2016-2017 school year.

Consonant with these mandates, the new Progress Table process provides a mechanism to reward improved SOL scores and neglect declining scores.

In the current calculation of the accreditation pass rate, a passing student counts as “one passer” in the numerator of the pass rate. In order to account for students working toward demonstrating proficiency, partial credit could be awarded for sub-level growth.

Note: partial credit for sub-level growth but no penalty for sub-level decline!
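
A hypothetical sketch of that arithmetic (the 0.5 weight is invented; the quote says only that partial credit “could be awarded”):

```python
def accreditation_pass_rate(passers, growers, decliners, testers):
    """Hypothetical Progress Table arithmetic: failing students who show
    sub-level growth add to the numerator, while declining students
    cost nothing.  The 0.5 weight is invented for illustration."""
    return 100 * (passers + 0.5 * growers + 0.0 * decliners) / testers

# 60 passers out of 100 testers; 20 failers grew, 20 declined.
print(accreditation_pass_rate(60, 20, 20, 100))  # 70.0, not 60.0
```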

As Omar Cranky wrote earlier:

The moving target moves; and having moved,
Moves on: nor all thy piety nor wit
Shall lure it back to give an honest answer
Nor all thy tears wash away the bureaucrats’ obfuscation.

But, until they take away the SOL database, we’ll have the ability to calculate real progress by subject and grade for each division and school.  Stay tuned for an example.

How About Those Salaries?

As a further look at the effect of expenditures on performance, here are the 2015 division average reading pass rates v. the 2015 division average budgeted teacher salaries (the actual 2015 salary data won’t be available until around the first of the year).  Data for Accomack are missing for lack of a report to VDOE.

[image: 2015 division reading pass rates v. average budgeted teacher salaries]

Richmond is the gold square; the blue circle is the statewide division average.

The fitted curve suggests that an additional $10,000 in average salary is associated with a 2% increase in the pass rate, but the R² tells us that the two variables are essentially uncorrelated.
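
Anyone with the two data columns can reproduce such a fit; a sketch (the arrays here are invented placeholders for the VDOE salary and pass-rate columns):

```python
import numpy as np

# Invented placeholder data; substitute the division salary and
# pass-rate columns from the VDOE tables.
salary = np.array([45_000, 50_000, 55_000, 60_000, 65_000], dtype=float)
pass_rate = np.array([74.0, 79.0, 75.0, 78.0, 76.0])

slope, intercept = np.polyfit(salary, pass_rate, 1)
predicted = slope * salary + intercept
r2 = 1 - ((pass_rate - predicted) ** 2).sum() / ((pass_rate - pass_rate.mean()) ** 2).sum()

print(f"{slope * 10_000:+.1f} points per $10,000")  # slope of the fitted line
print(f"R^2 = {r2:.3f}")                            # near zero = essentially uncorrelated
```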

The math data paint a similar picture.

[image: 2015 division math pass rates v. average budgeted teacher salaries]

Of course, we know that increasing economic disadvantage of the student population is associated with lower pass rates.  We can account for the average effect by using the correlation between pass rate and economic disadvantage to normalize the pass rates, i.e., express the pass rates as percentages of the economic disadvantage trendline rates.   That produces these graphs:

[image: normalized reading pass rates v. average budgeted teacher salaries]

[image: normalized math pass rates v. average budgeted teacher salaries]

Again, only minuscule correlations.  And the fitted curves, to the extent they mean anything, say “no benefit from the higher salaries.”
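
For those who want to check the normalization itself, a minimal sketch (invented numbers; the trendline comes from regressing pass rate on percent economically disadvantaged):

```python
import numpy as np

# Invented numbers; substitute the division-level VDOE data.
pct_disadvantaged = np.array([20.0, 35.0, 50.0, 65.0, 80.0])
pass_rate = np.array([85.0, 80.0, 72.0, 70.0, 62.0])

# Trendline: pass rate expected at each level of economic disadvantage.
slope, intercept = np.polyfit(pct_disadvantaged, pass_rate, 1)
expected = slope * pct_disadvantaged + intercept

# Normalized rate: actual as a percentage of expected.
# 100 = on trend; above 100 = outperforming the division's poverty level.
normalized = 100 * pass_rate / expected
print(np.round(normalized, 1))  # e.g., [100.  100.8  97.6  102.6  99.]
```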

So it seems that the divisions that pay their teachers more do not get better SOL performance; they merely pay more for the performance they get.

Finally, here for two of my faithful readers (maybe the only two) are the last two graphs showing the results for Charles City (purple circle) and Lynchburg (red circle).

[image: reading graph with Charles City and Lynchburg marked]

[image: math graph with Charles City and Lynchburg marked]

Data are posted here.

Ranking Richmond

Jim Weigand sent along his analysis of the history of Richmond’s statewide rank among the 132 divisions on his five-subject SOL pass rate average.  Here are the data.  The black line is the lowest possible rank: 132d out of 132 divisions.

[image: Richmond’s statewide rank on the five-subject SOL average, by year]

EXTRA CREDIT QUESTION: An “A” to the reader who can tell me how to get Excel 2010 to reverse the Ordinate, as here, but get the hash marks to go to the bottom, rather than stay at the top, as here.

Money Won’t Make Them Learn More

Jim Weigand of Lynchburg has an alternative to my data suggesting that spending more money doesn’t improve SOL scores.  Jim averages the five subject-area division pass rates and compares them to the excess above the Required Local Effort (RLE) for the Standards of Quality.  The 2015 RLE data are not posted yet, so he uses the 2014 data for both quantities.  His result:

[image: 2014 division five-subject average SOL pass rates v. excess RLE]

The fitted curve suggests that doubling the local effort raises the average pass rate by 2.7%, but the R² of 3.4% tells us that the two variables are essentially uncorrelated.

Here are Jim’s data:

2014 Data

Division | 5-Subject Avg. SOL | Rank | Excess RLE | Rank | Difference in Ranking
Albemarle County | 80.17% | 25 | 140.25% | 12 | -12
Alexandria City | 68.76% | 109 | 183.58% | 4 | -104
Alleghany County | 70.25% | 104 | 180.40% | 5 | -98
Amelia County | 73.02% | 88 | 44.58% | 106 | 19
Amherst County | 75.47% | 67 | 94.26% | 51 | -15
Appomattox County | 77.00% | 49 | 15.34% | 127 | 79
Arlington County | 84.12% | 10 | 193.87% | 3 | -6
Augusta County | 76.36% | 54 | 77.48% | 67 | 14
Bath County | 77.60% | 43 | 118.81% | 28 | -14
Bedford County | 75.35% | 70 | 87.24% | 57 | -12
Bland County | 75.21% | 72 | 38.08% | 112 | 41
Botetourt County | 84.78% | 6 | 132.86% | 19 | 14
Bristol City | 76.59% | 23 | 44.94% | 104 | 82
Brunswick County | 63.99% | 125 | 17.81% | 126 | 2
Buchanan County | 67.65% | 111 | 73.69% | 71 | -39
Buckingham County | 71.86% | 94 | 37.03% | 113 | 20
Buena Vista City | 64.64% | 122 | 63.02% | 91 | -30
Campbell County | 75.97% | 58 | 112.60% | 30 | -27
Caroline County | 71.78% | 95 | 36.65% | 114 | 20
Carroll County | 76.01% | 57 | 102.29% | 39 | -17
Charles City County | 77.73% | 42 | 95.38% | 49 | 8
Charlotte County | 77.04% | 48 | 34.84% | 116 | 69
Charlottesville City | 75.57% | 64 | 154.45% | 9 | -54
Chesapeake City | 81.47% | 18 | 114.57% | 29 | 12
Chesterfield County | 79.41% | 31 | 82.30% | 62 | 32
Clarke County | 77.50% | 45 | 101.54% | 52 | 8
Colonial Beach | 64.00% | 124 | 64.95% | 85 | -38
Colonial Heights City | 79.54% | 30 | 171.97% | 7 | -22
Covington City | 71.32% | 97 | 152.31% | 10 | -86
Craig County | 74.41% | 78 | 39.00% | 110 | 33
Culpeper County | 75.39% | 69 | 60.11% | 92 | 24
Cumberland County | 66.77% | 115 | 69.99% | 77 | -37
Danville City | 65.36% | 120 | 88.81% | 54 | -65
Dickenson County | 74.09% | 80 | 63.43% | 89 | 10
Dinwiddie County | 71.94% | 93 | 70.52% | 76 | -16
Essex County | 63.21% | 126 | 49.29% | 100 | -25
Fairfax County | 84.02% | 11 | 127.86% | 22 | 12
Falls Church City | 91.10% | 1 | 170.65% | 8 | 8
Fauquier County | 79.24% | 33 | 112.23% | 31 | -1
Floyd County | 75.95% | 60 | 45.87% | 102 | 43
Fluvanna County | 77.52% | 44 | 65.86% | 83 | 40
Franklin City | 59.36% | 131 | 129.24% | 21 | -109
Franklin County | 81.99% | 17 | 64.35% | 88 | 72
Frederick County | 75.54% | 65 | 124.29% | 23 | -41
Fredericksburg City | 74.36% | 79 | 134.11% | 18 | -60
Galax City | 74.69% | 75 | 70.74% | 75 | 1
Giles County | 76.71% | 52 | 43.03% | 108 | 57
Gloucester County | 77.27% | 47 | 98.21% | 46 | 0
Goochland County | 83.94% | 12 | 59.77% | 93 | 82
Grayson County | 72.39% | 90 | 38.09% | 111 | 22
Greene County | 73.62% | 83 | 73.40% | 73 | -9
Greensville County | 67.45% | 112 | 27.61% | 121 | 10
Halifax County | 67.78% | 110 | 34.36% | 117 | 8
Hampton City | 70.83% | 101 | 88.31% | 56 | -44
Hanover County | 84.36% | 7 | 58.92% | 94 | 88
Harrisonburg City | 71.17% | 100 | 102.25% | 40 | -59
Henrico County | 77.94% | 39 | 69.50% | 79 | 41
Henry County | 74.00% | 81 | 39.17% | 109 | 29
Highland County | 66.24% | 117 | 23.29% | 124 | 8
Hopewell City | 67.25% | 113 | 73.16% | 74 | -38
Isle of Wight County | 81.38% | 19 | 68.85% | 80 | 62
King and Queen County | 70.30% | 103 | 73.63% | 72 | -30
King George County | 77.88% | 40 | 53.80% | 98 | 59
King William County | 78.66% | 36 | 100.45% | 44 | 9
Lancaster County | 62.39% | 128 | 77.00% | 68 | -59
Lee County | 74.90% | 74 | 9.93% | 130 | 57
Lexington City | 83.32% | 14 | 52.26% | 99 | 86
Loudoun County | 86.03% | 4 | 138.33% | 14 | 11
Louisa County | 79.26% | 32 | 69.69% | 77 | 46
Lunenburg County | 69.29% | 108 | 24.07% | 123 | 16
Lynchburg City | 64.39% | 123 | 103.34% | 37 | -85
Madison County | 72.18% | 92 | 136.20% | 16 | -75
Manassas City | 71.38% | 96 | 172.35% | 6 | -89
Manassas Park City | 71.20% | 99 | 102.57% | 38 | -60
Martinsville City | 59.78% | 130 | 111.14% | 32 | -97
Mathews County | 76.74% | 51 | 58.44% | 95 | 45
Mecklenburg County | 66.65% | 116 | 29.19% | 119 | 4
Middlesex County | 80.71% | 23 | 35.60% | 115 | 93
Montgomery County | 78.83% | 34 | 79.77% | 66 | 33
Nelson County | 78.53% | 37 | 101.51% | 43 | 7
New Kent County | 80.90% | 21 | 81.64% | 64 | 44
Newport News City | 66.85% | 114 | 110.30% | 33 | -80
Norfolk City | 65.94% | 118 | 90.52% | 52 | -65
Northampton County | 65.29% | 121 | 31.46% | 118 | -2
Northumberland County | 77.28% | 46 | 56.26% | 96 | 51
Norton City | 78.01% | 38 | 47.40% | 101 | 64
Nottoway County | 70.15% | 105 | 27.39% | 122 | 18
Orange County | 76.35% | 56 | 63.27% | 90 | 35
Page County | 73.43% | 85 | 64.59% | 87 | 3
Patrick County | 74.44% | 77 | 11.02% | 129 | 53
Petersburg City | 57.64% | 132 | 44.37% | 107 | -24
Pittsylvania County | 78.76% | 35 | 22.77% | 125 | 91
Poquoson City | 86.21% | 3 | 97.70% | 47 | 45
Portsmouth City | 69.40% | 107 | 85.72% | 58 | -48
Powhatan County | 82.01% | 16 | 107.65% | 35 | 20
Prince Edward County | 65.42% | 119 | 95.62% | 48 | -70
Prince George County | 79.95% | 27 | 45.02% | 103 | 77
Prince William County | 79.81% | 29 | 98.56% | 45 | 17
Pulaski County | 74.92% | 73 | 65.25% | 84 | 12
Radford City | 75.41% | 68 | 83.96% | 60 | -7
Rappahannock County | 76.85% | 50 | 76.17% | 70 | 21
Richmond City | 59.84% | 129 | 90.30% | 53 | -75
Richmond County | 75.93% | 61 | 76.54% | 69 | 9
Roanoke City | 71.21% | 98 | 132.74% | 20 | -77
Roanoke County | 85.33% | 5 | 103.62% | 36 | 32
Rockbridge County | 75.48% | 66 | 82.02% | 63 | -2
Rockingham County | 79.99% | 26 | 138.87% | 13 | -12
Russell County | 75.58% | 63 | 29.03% | 120 | 58
Salem City | 83.35% | 13 | 142.76% | 11 | -1
Scott County | 80.75% | 22 | 13.33% | 128 | 107
Shenandoah County | 75.58% | 62 | 84.78% | 59 | -2
Smyth County | 73.86% | 82 | 44.66% | 105 | 24
Southampton County | 73.52% | 84 | 68.22% | 81 | -2
Spotsylvania County | 76.35% | 55 | 120.89% | 26 | -28
Stafford County | 81.24% | 20 | 123.97% | 24 | 5
Staunton City | 73.13% | 87 | 88.40% | 55 | -31
Suffolk City | 72.75% | 89 | 66.41% | 82 | -6
Surry County | 73.16% | 86 | 136.52% | 15 | -70
Sussex County | 62.89% | 127 | 221.04% | 1 | -125
Tazewell County | 77.87% | 41 | 8.84% | 131 | 91
Virginia Beach City | 79.85% | 28 | 120.91% | 25 | -2
Warren County | 75.97% | 59 | 83.74% | 61 | 3
Washington County | 80.71% | 23 | 108.98% | 34 | 12
Waynesboro City | 70.40% | 102 | 120.38% | 27 | -74
West Point | 90.97% | 2 | 217.53% | 2 | 1
Westmoreland County | 69.96% | 106 | 54.40% | 97 | -8
Williamsburg-James City County | 82.58% | 15 | 94.36% | 50 | 36
Winchester City | 74.62% | 76 | 134.37% | 17 | -58
Wise County | 84.27% | 9 | 101.67% | 41 | 33
Wythe County | 75.25% | 71 | 64.87% | 86 | 16
York County | 84.33% | 8 | 80.51% | 65 | 58
Accomack County | 72.33% | 91 | No Report | n/a | n/a

Put ‘Em On a Bus to West Point

We have seen that school divisions with higher poverty rates score less well on the SOLs.  For example, looking at the 2015 reading pass rates by division v. the percentage of students classified as economically disadvantaged, we see:

[image: 2015 division reading pass rates v. percent economically disadvantaged]

We also have seen that spending more money per student does not correlate with higher pass rates.  For instance:

[image: 2015 division pass rates v. 2014 disbursements per student]

Note: The data here are 2014 disbursements (the 2015 data won’t be available until sometime this Spring), with disbursements for facilities, debt service, and contingency reserve removed.

To remove the effect of economic disadvantage, we use the fitted trendline for the first graph to normalize the scores (i.e., express the pass rates as percentages of the trendline rates).  That produces the following graph for the reading tests.

[image: normalized reading pass rates v. disbursements per student]

It turns out that the pass rates fail to correlate with expenditures per student.

BTW: Richmond is the gold square on the graph; the red diamonds are, from the left, Hampton, Newport News, and Norfolk.  The outperforming divisions marked with yellow diamonds are, from the left, Norton, Wise, West Point, and Highland.

Here is the same graph for the math tests:

[image: normalized math pass rates v. disbursements per student]

The gold square again is Richmond; the red diamonds again are Hampton, Newport News, and Norfolk.  The yellow diamonds are Wise, West Point, and Highland.

Again, with the average effect of poverty removed, something other than money explains the differences in performance of the divisions.  Whatever that “something” is, Richmond does not know it, or at least does not practice it.

To me, these data suggest that we should put the Richmond Superintendent and School Board on a bus and send them to Wise and West Point for a week each.  And tell them to stop whining about money.  Whatever the solution to our (HUGE) problem may be, it does not come with a dollar sign.

JLARC Punts

The Joint Legislative Audit and Review Commission (JLARC) has just published its draft report “Efficiency and Effectiveness of K-12 Spending.”  Unfortunately, that report does not even look carefully at where Virginia is spending its educational dollars, much less answer the (much harder) question of what we are getting for that money.

The Mandates

The General Assembly gave JLARC two decently clear mandates.

SJR328 (2013): JLARC shall

study the efficiency and effectiveness of elementary and secondary school spending in Virginia.  [It] shall (i) study the efficiency and effectiveness of elementary and secondary school spending in Virginia, including evaluating the findings from School Efficiency Reviews and assessing the extent to which recommendations have been implemented; (ii) compare to other states how and to what extent Virginia funds elementary and secondary education; and (iii) identify opportunities to improve the quality of education students receive in consideration of the funds spent.

2014 Appropriation Act, Item 30 (at p. 62 of the link): JLARC to examine virtual instruction, to include “effectiveness of virtual schooling in terms of student academic achievement outcomes on assessment tests and course completion or graduation rates.”

The “Study”

The result is a 112-page draft that ignores the instructions of the General Assembly. 

Of the nine recommendations, six talk about efficiency; half of those six deal with school buses; only one deals with something that relates to education.  None tells us about the educational effectiveness of our school spending or how to improve it:

  1. Track teacher turnover.
  2. Provide facilities management expertise.
  3. Provide “guidance” regarding sharing information about facilities management best practices.
  4. Consider statewide contract for bus routing and monitoring software.
  5. Provide transportation management expertise.
  6. Assist with transportation management best practices.

As to virtual schooling, JLARC again avoids answering the question.  The three recommendations:

  1. Provide information about online schools.
  2. Estimate costs of online learning.
  3. Compare achievement of virtual v. physical schools.

That last one is particularly rich: JLARC is recommending that VDOE do what the General Assembly told JLARC to do.

Cranky’s Conclusion

This study is a wordy waste of money.  It does not answer the questions posed by the General Assembly.  Instead, it raises a new question: Why are we paying JLARC to not do what it’s been told to do?

A Reader’s Conclusion (added on 9/17)

A reader suggests an alternate (and more pertinent) conclusion: Why are we paying JLARC not to do what it’s been told to do, when we already are paying VDOE, which should be doing [what JLARC failed to do]?

New (Federal) College Data

USDOE has just posted a considerable trove of college data.

CAVEAT:  These data are mostly for students who received federal financial aid. 

  • “Average Annual Cost”: The average annual net price for federal financial aid recipients, after aid from the school, state, or federal government. For public schools, this is only the average cost for in-state students.
  • “Graduation Rate”: The graduation rate after six years for schools that award predominantly four-year degrees and after four years for all other schools. These rates are only for full-time students enrolled for the first time.
  • “Salary After Attending”: The median earnings of former students who received federal financial aid, at 10 years after entering the school.

My quick reading of the data does not disclose what fraction(s) of the student populations are represented here. 

With that warning, here is a look at the Virginia public and not-for-profit colleges.  First the graduation rates:

[image: graduation rates, Virginia public and not-for-profit colleges]

The winners there are UVa in red, W&L in yellow, and W&M in green.

Next, the median salary ten years out:

[image: median salary ten years after entry, by college]

W&L, in yellow, is the big winner here.

Finally, a bang/buck calculation, (Salary * Graduation Rate) / Average Cost:

[image: bang/buck index by college]

Colors, as before, are UVa in red, W&L in yellow.
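
As a worked example of the bang/buck arithmetic (invented numbers, not any particular school’s):

```python
# Invented numbers, not any particular school's data.
median_salary = 60_000        # "Salary After Attending," ten years out ($)
graduation_rate = 0.90        # graduation rate, as a fraction
average_annual_cost = 15_000  # net price for aid recipients ($/year)

# (Salary * Graduation Rate) / Average Cost
bang_per_buck = median_salary * graduation_rate / average_annual_cost
print(round(bang_per_buck, 1))  # 3.6 -- more salary per dollar of cost is better
```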

Here is the dataset, sorted by school name.

[image: the dataset, sorted by school name]

You might be interested in comparing these data with the results of the Brookings “value-added” study.

No SATisfaction

VDOE has posted the 2015 Virginia average SAT scores.  As of today (9/11/15), RPS has not. 

While I was looking for scores, I found a list of SAT scores for State universities in Virginia.  The site does not date those.  Given that the scores do not change a lot from year to year, I thought it might be interesting to juxtapose the university data with the 2014 Richmond scores.

Here, then, are the 25th and 75th percentile reading scores of students entering our public universities, along with the state and Richmond averages for college-bound students as reported by RPS:

[image: SAT reading: 25th/75th percentile scores by university, with state and Richmond averages]

Notice that this is an apples-and-oranges comparison: division averages for all college-bound test-takers v. score ranges for the students each school actually enrolled.  That said, the state average for college-bound students is close to the 25th percentile scores of entering students at Mason, VMI, and Christopher Newport.  The Richmond average is fifty points below the 25th percentile at Longwood.

And here are the math scores:

[image: SAT math: 25th/75th percentile scores by university, with state and Richmond averages]