More Mischief at VDOE

VDOE is running a “survey.”

After two preliminary questions (nature of your interest in schools; where you live), the survey (reproduced below) asks how important the respondent thinks various factors are “for federal accountability purposes.”

There are fourteen questions about matters such as pass rates, academic growth, graduation rates, and dropout rates.

Then, second from the bottom, there is one about the crucial element in school success, the teachers.  But Our Leaders fall back on the old, misleading stupidity of measuring inputs and ignoring outputs: They ask about teacher credentials, not teacher effectiveness.

This fits into an ugly pattern:

It is clear that SOL performance decreases with increasing poverty.  Under the federal prod, VDOE began calculating Student Growth Percentiles (“SGPs”) that were essentially uncorrelated with economic disadvantage. 

The SGPs proved to be embarrassing to the Virginia “Education” Association and VDOE: They allowed a clear measure of teacher effectiveness (see this and this).  So VDOE abandoned ship and introduced “Progress Tables” that roughly measure student progress but mostly ignore the lack of progress.

Now they ask about the importance of teacher credentials (that do not measure teacher effectiveness) but they ignore even their diluted measure of teacher effectiveness.

Guess what: You and I get taxed to pay for this deliberate sidelining of the most important public influence on our schoolchildren, teacher effectiveness.

—————————–

Here is the meat of the survey form:

image

Your Tax Dollars at “Work”

The June 2016 Post from our Superintendent discusses at some length the current (2017, based on 2016 data) accreditation results.  The VDOE Accreditation page, last updated on Feb. 29, 2016, shows the data from last year but not the current numbers.

Why do you suppose that the Richmond Superintendent had those numbers some time last month but you and I still cannot get them?

Indeed, why do you suppose VDOE could use the SOL data to assess accreditation in June (or before) but cannot use the same data to calculate SGP until Fall [pdf at slide 2]?

Stay tuned while I try to find out. 

In the meantime, consider the possibility that VDOE is, among other things, far more the State Department of Superintendent Protection than the Department of “Education.”

Economic Disadvantage and Richmond’s Awful Middle Schools

We’ve seen that division SOL pass rates fall with increasing economic disadvantage.  Those data also suggest that Richmond’s gross underperformance is not explained by the economic disadvantage of the Richmond students.

Drilling further into the relationship between academic performance and economic disadvantage (ED for short), the reading pass rates of Richmond’s elementary schools show a moderate correlation with ED, and the mathematics rates a weak one, but our middle and high schools show considerably more robust correlations:

image

image

image

Here are the SOL/ED data:

image

image

image

image

Note: Franklin has both middle and high school grades; I omit it from the graphs because it does not directly compare to either kind of school.

Caveat: Correlation is a necessary but not sufficient condition to infer causation.
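For readers who want to check my work: the R² on each graph comes from a simple least-squares fit of pass rate against %ED.  Here is a minimal sketch of that calculation in Python; the numbers below are invented stand-ins, not the actual school data.

```python
# Sketch of the trend-line/R^2 calculation behind these graphs; the
# values are invented stand-ins for the real pass-rate and %ED figures.
import numpy as np

pct_ed = np.array([35.0, 48.0, 62.0, 71.0, 80.0])      # % economically disadvantaged
pass_rate = np.array([78.0, 72.0, 66.0, 58.0, 55.0])   # SOL pass rate, %

slope, intercept = np.polyfit(pct_ed, pass_rate, 1)    # least-squares trend line
r = np.corrcoef(pct_ed, pass_rate)[0, 1]
print(f"pass rate = {slope:.2f} * %ED + {intercept:.1f}; R^2 = {r**2:.0%}")
```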

The other thing to notice about the middle schools is the very low pass rates.  Here, for reference, are the average pass rates by grade.  The horizontal lines are the reading and math “benchmarks” for accreditation.

image

Why do the middle schools get much lower SOL pass rates with mostly the same kids as the elementary schools?  The obvious inference is that the middle schools are doing a much worse job.  See below.

In any case, the R² values imply that the SOL is affected, especially in the middle and high schools, by economic condition or something related to it.

The Student Growth Percentile (SGP) was supposed to remove that correlation so I turned to the latest available data, the 2014 data by school in the 2d download from VDOE in response to Brian Davison’s suit.

There are no high school reading or mathematics data for Richmond in that dataset (EOC Algebra I only) but the elementary and middle school results are compelling. 

image

Here we see our elementary schools performing at about the 50th percentile on math and a notch lower on reading.  Those performances were mostly uncorrelated with ED (reading R² of 1%; math, 3%).  The Good News: These learning measures, especially the reading, are a bit better than the SOL pass rates might suggest.

The school with a reading SGP of 71 (!) is Carver; the 63 is Jones.  As to math, we have six schools above the 60th percentile (Ginter Park at 70; Fisher, 67; Carver, 66; Jones, 65; Munford, 64; and Greene, 62), with Reid in the basement at 32.  That collection of reading SGPs just under 40 is not encouraging.

Caveat: These data use the whole-school %ED from the Fall census.  The VDOE data would allow calculation for only the SGP grades, 4 & 5, except that their data suppression rules give blank ED values for Munford and Henry by suppressing the fifth grade data (fewer than ten kids reported).  The totals are larger than the sums for the individual grades and presumably include all the ED students, so I’ll stick with the (presumably undoctored) total data.

Here are the data:

image

The two very low ED schools are Munford at 10%, performing well above the 50th percentile, and Fox at 22% ED scoring at the 50th percentile in reading but only the 44th in math.  This makes it look like those nice SOLs at Fox are the result of smart kids who are scoring well but not improving as much as the smart kids in other schools.

The 24th percentile score in math is Reid.

The conclusion: On the 2014 data, our elementary schools are doing an average job, on average.  There’s work to be done at Reid and some others but, all in all, the SGPs report more learning than the SOLs might suggest.

And how much the kids learned was generally unrelated to economic disadvantage.

The middle schools were an unhappier story:

image

image

The database let me pull the 6th, 7th, and 8th grade data so I’ve included Franklin.

Note the low average performance and the modest correlation of the math scores.  Also notice the absence of schools with low ED populations.

As to that last point, these data raise the question whether those low-ED kids from Munford and Fox have dropped out, have gone to the Counties or to private schools for middle school, or whether their numbers just disappear into the average.

On that issue, here, first, are the totals:

image

And here are the details:

image

Or, relative to the 9th grade memberships:

image

VDOE publishes no data on kids who drop out before entering middle school.  The data they do share indicate zero dropouts from grades 7 or 8 in 2014.  That seems unlikely, but it’s all the information we have.

We are left with the possibility that the middle school drop in membership and rise in %ED reflects some of the more affluent kids fleeing to private schools and to the Counties.  The precipitous drops in both total and ED membership after the 9th grade surely come from dropouts.

But to revisit the major point: The low correlations with ED tell us that the low middle school SGPs can’t be caused by the increased economic disadvantage; the leading candidates for those lousy SGPs, then, are lousy teaching and/or lousy administrators who fail to control the middle schools.

The other point here: The State Department of Data Suppression has stopped calculating SGPs, which leaves us with the manifestly flawed SOL data to assess school (and teacher) quality.  It seems we’ll have to wait until late summer to see whether they are going to release or suppress their new progress (aka “value”) tables that measure academic progress (but mostly ignore the lack of it).

The Awfulness of Richmond’s Middle Schools

While pulling data for an upcoming post on our middle schools and the SGP, I downloaded these SOL pass rates:

image

The horizontal lines are the “benchmarks” for accreditation.

That provoked graphs of the latest (and last) SGP data (the second of the three VDOE downloads) that show Richmond’s 2014 middle school average reading and math SGP percentiles along with those of the other divisions:

image

image

Richmond is the gold bar on each graph.  The red bars are the peer divisions: from the left Norfolk, Newport News, and Hampton.

Please recall that the SGP does not correlate with economic disadvantage so Richmond can’t blame the kids for this dismal showing.

Here are the data:

image image


The Devil Made Me Do It

After I had posted the data above, it was raining and there was no going outside.  So I pulled the Richmond SOL data by grade and by year.  Here they are:

image

image

The only pattern that emerges here, beyond the terrible middle school numbers, is the score drops that came with the new math tests in 2012 and the new reading tests in 2013.  Those, of course, were associated with the former Superintendent’s failure to align the Richmond curricula with the new tests.

The Empire Strikes Back. Feebly.

On June 2, Donald Wilms, President of the Chesterfield Education Association, responded in the Times-Dispatch to Bart Hinkle’s editorial of May 28.

Hinkle had made the point that the Virginia Education Association’s attempt to suppress truthful data on teacher effectiveness sought to keep “parents and taxpayers . . . in the dark about which teachers are doing a great job – and which ones aren’t.” 

Wilms sought to argue that the data should be kept secret.  He mostly demonstrated that the Chesterfield Education Association needs a better argument.

Wilms offered some emotional arm-waving about the students who may come to school on test day after oversleeping and missing breakfast; after a fight; after losing a boy-/girlfriend; or the like.  He neglected to mention that the data he disdains are based on two (or more) successive years’ test scores:  An outside event in the second year could lower a student’s measured progress, but the same event in the first year could increase the score difference and, thus, the progress measure.

The nut of Wilms’ argument, however, was the kids who are at a systematic disadvantage:

[W]ould it be fair for schools with high-income households — where both parents are college-educated, where kids go to museums and take exotic vacations, where parents have two cars and are available to take kids to the public library — to compete with schools where kids live in single-parent households, where parents hold several low-wage jobs, with hours preventing them from being home to take kids to museums or public libraries, and incomes preventing them from any vacation at all? Heck, it would be unfair to compare a school in western Chesterfield with one in eastern Chesterfield, let alone to compare one of Chesterfield’s (or Henrico’s) wealthiest school communities with one of Richmond’s neediest school communities, don’t you think?

* * *

What these [SGP data] really illustrate is which teachers have the at-risk kids and which don’t.

On these emotional grounds, Wilms attacked the fairness of the “flawed rankings” of the Student Growth Percentile, the “SGP” (and, apparently, other measures of teaching effectiveness).  He disdained to discuss the actual data, which are to the contrary.

When it was considering adopting the SGP, VDOE provided data showing that test scores fall with increasing economic disadvantage (no surprise there) but that SGP percentiles do not:

image

That is because the SGP, by design, compares low-performers only to other low-performers.  Unlike the SOL, it measures progress relative to others similarly situated.

Indeed, the data for Chesterfield County make a clear case that students who score poorly one year, for whatever reason, can show outstanding academic growth the next year.  Here, for instance, are the 2014 math SGP scores of Chesterfield students plotted against those same students’ 2013 SOL scores.

image

Before drawing any conclusions, we need to refine the analysis slightly: Students who score pass/advanced (SOL 500 or above) two years running do not receive an SGP (it’s hard to improve on “excellent”).  In a sense, the SGP penalizes teachers and schools with large numbers of very bright students!

The SOL scores ≥ 500 in the graph above represent students who scored advanced in 2013 but less than advanced in 2014.  The students who scored advanced in both ‘13 and ‘14 do not appear in this dataset at all, which biases the analysis.  So let’s look only at the students with 2013 SOLs < 500:

image

The result doesn’t change much.  The R² value of 0.14% tells us that the SGPs are quite uncorrelated with the previous year’s SOL scores.  Of course, correlation is necessary but not sufficient to show causation.  Said otherwise: Chesterfield students with low SOL scores in the previous year, for whatever reason, show superior (or inferior) academic growth (as measured by the SGP) in the current year just as often as students who scored well in the previous year.
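Anyone who wants to reproduce that number can do so in a few lines once the student-level data are in hand.  A sketch (the file and column names are hypothetical placeholders; the real data are in the VDOE download):

```python
# Sketch of the R^2 check described above; one row per student with a
# 2013 SOL scale score and a 2014 math SGP.  File and column names are
# hypothetical placeholders for the VDOE download.
import numpy as np
import pandas as pd

df = pd.read_csv("chesterfield_math_sgp.csv")

# Drop the advanced-both-years artifact: keep only 2013 SOLs below 500
sub = df[df["sol_2013"] < 500]

r = np.corrcoef(sub["sol_2013"], sub["sgp_2014"])[0, 1]
print(f"R^2 = {r**2:.2%}")   # the Chesterfield data give about 0.14%
```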

We can’t directly test Wilms’ statement about pitting a rich school against a poor one because VDOE (known here as the Virginia Department of Data Suppression) hasn’t released (and won’t release) the data.  So let’s go one better: Let’s use the data we have to compare some of the worst students in Chesterfield in terms of 2013 SOL with some of the best.

Using the math data, here is the first group, all at 50 points or more below the passing score of 400:

image

As before, the 2014 SGP does not correlate with the 2013 SOL. 

Next, the 2013 SOLs between 450 (50 points above the passing score) and 499 (where we stop to avoid the advanced-student penalty).

image

Pretty much the same story: High or low SOL one year does not predict SGP growth in the next year.

Said yet otherwise: Part of teaching in the public schools is dealing with kids who are disadvantaged; the SGP identifies teachers who do that well; the SOL may not.

BTW: The reading data tell the same story with slightly different numbers.  Let’s avoid some clutter and leave them out. 

It’s easy to understand why Wilms would prefer the current system.  In 2011 (the only year that VDOE slipped up and posted evaluation data), Chesterfield teachers were evaluated on seven criteria.  Six of those were inputs, i.e., relatively easy to measure but only distantly related to student learning.  On the only important measure, “Student Achievement and Academic Progress,” only twelve of 1222 Chesterfield teachers (1%) were said to need improvement and none was unsatisfactory. 

image

But when we look at real performance data (2012 SGPs are the earliest that Brian Davison sued out of VDOE), we see student progress that was much worse than “unsatisfactory.”  For example, here is the math performance of Teacher No. 34785 (identifier anonymized by VDOE):

image

The yellow points are the annual SGP averages of that teacher’s math students.  The blue bars show the 95% confidence intervals. 

The Chesterfield math averages those years were 48.7, 49.2, and 48.2.
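For the record, a 95% confidence interval like those blue bars can be computed from one year’s class SGPs in the usual way.  A sketch, with invented scores:

```python
# 95% confidence interval for one teacher's class-average SGP,
# using invented scores for illustration.
import numpy as np
from scipy import stats

sgps = np.array([12, 8, 25, 17, 30, 9, 22, 14, 11, 19])   # one class, one year

mean = sgps.mean()
sem = stats.sem(sgps)                                      # standard error of the mean
lo, hi = stats.t.interval(0.95, len(sgps) - 1, loc=mean, scale=sem)
print(f"mean = {mean:.1f}, 95% CI = ({lo:.1f}, {hi:.1f})")
```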

Then we have Nos. 54317 and 86898:

image  image

To put some context on these numbers, here is the 2014 statewide distribution of math SGP averages by teacher.

image

The mean was 49.3; the estimated standard deviation, 16.3.  That is, about sixteen percent of the teachers were below 33.0; another sixteen percent, above 65.6.  Two and a half percent were below 16.7; another 2.5%, above 81.9. I’ve marked those values on the 2014 Chesterfield distribution:

image
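Nothing fancy hides in those cutoffs; they are just the mean plus or minus one and two standard deviations, with the usual normal-distribution tail fractions:

```python
# The marked cutoffs: mean +/- 1 and 2 standard deviations, with the
# usual normal-distribution tail fractions.
mean, sd = 49.3, 16.3
print(f"+/-1 sd: {mean - sd:.1f} to {mean + sd:.1f}   (~16% in each tail)")
print(f"+/-2 sd: {mean - 2*sd:.1f} to {mean + 2*sd:.1f} (~2.5% in each tail)")
```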

The problem here is not “flawed rankings.”  It is flawed teachers that the Chesterfield “Education” Association does not want the parents of Chesterfield County to know about.

BTW: The data also show that there are some really fine teachers in Chesterfield.  For example, in math we see Nos. 29893, 28974, and 81816:

image image image

The Chesterfield and Virginia “Education” Associations don’t want you to know about these outstanding teachers, either.  They just want you to think that 99% are above average.

And VDOE is a co-conspirator.  Your tax dollars “at work.”

Yet More of What VEA and VDOE Are Trying to Hide

In the discussion of the SGP data that Brian Davison sued out of the State Department of Data Suppression, I’ve been focused on the awful teaching (e.g., here, here, and here) that the Virginia Department of “Education” and the Virginia “Education” Association have been attempting to conceal.  But their efforts to hide teacher performance data from the taxpayers who are paying the teachers have another outrageous effect: They suppress the identities of the many great teachers in Virginia’s public schools.

Having looked at the 43 teachers with the worst three-year average math SGPs, let’s turn to the 43 with the best averages:

image

The “Row Labels” column contains the (anonymized) teacher IDs.  The “Grand Total” column is the three-year average for each teacher.  The “Grand Total” row reports the statewide average of each column.

All but three of the 43 teachers at the bottom of the SGP list damaged Virginia schoolchildren for only one year; almost half of the teachers in the present list helped educate Virginia schoolchildren for more than one year.

Here are the Top Ten who taught all three of the years for which we have data.

image

Notice that all but two of these got better performances from their students in 2014 than in 2012.  And, of those two, No. 115415’s even 99 in 2012, set against a much lower overall average, suggests only one student (of doubtful statistical significance) in the first year and substantial progress over the following two years.

The preponderance of NoVa suburbs in this list raises the question whether those divisions have better students, or better teachers, or some combination.  See below for some data suggesting that there is more learning, and presumably more teaching, in some more rural divisions.  In the meantime, here are data for the Top Ten in graphical form.

image

Or, with the ordinate expanded:

image

The division averages provide some further insights.  Let’s start with the distribution of division mathematics averages for 2014.

image

As shown above, the mean math SGP in 2014 was 49.1.  The division mean was 48.3, with a standard deviation of 6.2.

The Top Ten divisions of 2014 did not include any of the NoVa suburbs.

image

Neither, for that matter, did the bottom ten.

image

Here is the entire 2014 math list, sorted by division (Colonial Beach, Craig, and Surry had no data).

image

When we turn to the division averages by year, an interesting pattern emerges: The Top Ten in 2014 all improved from 2012.

image

image

Seems to me that a whole bunch of educators should be put on a bus to one of these divisions to find out how to increase student performance in math.

The Bottom Ten in 2014 mostly declined from 2012, except for Franklin and West Point, which improved; King and Queen, which improved slightly; and Galax, which only had data for 2014.

image

image

But the Virginia Department of “Education” and the Virginia “Education” Association don’t want you to see these data.  Apparently you are not qualified to know what you are getting (or not getting) for your tax dollar.

Still More of What VEA Wants to Hide

The SGP data that Brian Davison sued out of the State Department of Data Suppression last year showed (e.g., here and here) that we have some truly awful teachers in Virginia.  The lawsuit threatened by the Virginia “Education” Association demonstrates that it is more interested in protecting those bad teachers than in the effect of those teachers on Virginia’s schoolchildren.  And the VDOE’s own document, confirmed by the available data, admits that the teacher evaluation process has been worse than ineffective.

I’ve turned to the math data to provide some details.

A list of the very worst three-year teacher average math SGPs offers some Good News and some very Bad News. 

  • The Bad: Some truly awful teachers have been inflicted on Virginia’s schoolchildren, sometimes more than once. 
  • The Good: Most of those teachers from 2012 and 2013 were not there the following year.

Here are the worst 43 three-year math SGP averages by teacher, sorted by increasing average.

image

The “Row Labels” column contains the (anonymized) teacher IDs.  The “Grand Total” column is the three-year average SGP for each teacher.  The “Grand Total” row is the statewide average for each year and for the teacher averages.

We can’t tell from these data whether any of the teachers with only 2014 results stayed on to harm schoolchildren in 2015.  The yellow highlights identify the two teachers who came back after a failure to teach in an earlier year.

The red highlight indicates the teacher with the lowest average SGP among those who “taught” all three years.

Turning to the cases where the school division was so evil (or so intimidated by the VEA) that it subjected its students to more than two years’ exposure to the same awful teaching, here are the Top Ten, sorted by increasing three-year average:

image

No. 90763 is an outlier here: The three-year average of 17.1 comes from yearly averages of 71.0, 68.0, and 15.6.  The “.0s” are clues: This teacher likely had only one reported SGP in each of the first two years and a much larger class in 2014.  Indeed, the database shows 72 math SGPs in 2014.  If we add that year’s 15.625 average times 72 to the 71 and 68, and divide by 74, we get 17.08, the three-year average for this teacher.
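Here is that arithmetic, for anyone who cares to check it:

```python
# Checking the No. 90763 inference: two apparent single-student years
# (71 and 68) plus 72 students averaging 15.625 in 2014.
scores_12_13 = [71.0, 68.0]
n_2014, avg_2014 = 72, 15.625

total = sum(scores_12_13) + n_2014 * avg_2014
count = len(scores_12_13) + n_2014
print(f"three-year average = {total / count:.2f}")   # 17.08
```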

That said, there’s no telling what else unusual was going on there.

Turning to the other nine cases, we see:

image

Or, upon expanding the ordinate:

image

Only two of the nine showed improvement over the three-year period.  No. 88959 went from a 9.6 to a 28.3, which is still more than twenty percentiles below average.  No. 114261 improved, but only to 17.6.  We have no data regarding the guinea-pig students who were damaged in these experiments.

The other seven “teachers” all showed deteriorated performance over the period.  Note especially No. 54317, who went from bad to appalling (and was paid tax dollars to continue afflicting Chesterfield County schoolchildren).

There are as many as nine principals (depending on time frame in the job and whether it was one or two schools in Chesterfield) who should have been fired over these disasters.  I’ll bet you a #2 lead pencil they all got raises.

(And, for those of us in Richmond who look to the schools in the Counties as the Elysian Fields when compared to the barren wasteland of Richmond Public Schools, the presence of one Henrico and two Chesterfield teachers in this list comes as a shock.)

But, the Virginia “Education” Association and the State Department of Data Suppression don’t want you to know about this, especially if your kid is suffering under such an awful “teacher” and such a pusillanimous school system.

Your tax dollars at “work.”

More of What VEA Wants to Hide

Turning again to the SGP data that Brian Davison sued out of VDOE last year, let’s look at mathematics.

No. 40837, the teacher with the best 2014 math SGP average in Virginia, 95.2, had a fifth grade class in Fairfax.  Here are the data:

image

We can resist the temptation to dismiss this as the result of a class full of very bright students:  Students who scored in the advanced “proficient” range two years running didn’t get an SGP (it’s hard to show growth from really-good-already).  Thus, the students reported here did not obtain superior scores in both 2013 and 2014; they improved radically in 2014, compared to the others in Virginia who had similar performances in 2013.

Of further interest, this teacher’s reading SGPs (average of 65.7) are above average (48.0) but much less spectacular:

image

We can think of all kinds of explanations for this pattern.  In the absence of other data, Friar Occam would tell us to look first at the simple one: This is a superior math teacher and an above average reading teacher.

At the other end of the spectrum, we have No. 76323 whose fifth grade math students in Richmond averaged an SGP of 4.0.

image

The Virginia “Education” Association has threatened to sue because it doesn’t want you to know about this teacher.  But you can bet that the Richmond School Board members, Superintendent, principals and teachers knew and that none of their kids was among the unfortunate 25 in No. 76323’s class.

The reading performance of this teacher was a world better, with an above-average mean of 57.7:

image

This suggests that there is a place for this teacher in Richmond, just not teaching math (absent some heavy-duty retraining).

The second worst 2014 math performance comes from fourth grade teacher No. 71819 in Lynchburg, with an average of 4.4.

image

This same teacher turned in a 25.7 in reading.

image

So, one awful performance and one clearly sub-par. 

Unfortunately, this teacher was getting worse:

image

You can again bet that no child of a School Board member, the Superintendent, or any Lynchburg principal or teacher was damaged by this teacher.

All the schoolchildren of Lynchburg (and their parents who pay the teachers) deserve to be protected from this teacher, and the too many others who are nearly as bad.  Another twenty-nine Lynchburg teachers averaged a math SGP of less than thirty in 2014; eight of those were less than twenty.

image

The 2014 reading performance in Lynchburg was less horrible: One teacher was below an average SGP of twenty, another six were between twenty and thirty. 

The SGP data from 2012 to 2014 tell us that Lynchburg’s average performance was sub-par and deteriorating.

image

Unfortunately, we can’t rely on the schools to deal with the ineffective teachers.  VDOE quotes a 2009 study for the proposition that

99 percent of teachers were rated as satisfactory when their schools used a satisfactory/unsatisfactory rating system; in schools that used an evaluation scale with a broader range of options, an overwhelming 94 percent of all teachers received one of the top two ratings.

The only Virginia data we have on performance evaluations for teachers are from 2011.  That year, 99.3% of the Lynchburg teachers were rated “proficient”; 0.1% (one of 731) was “unacceptable – recommend plan of assistance”; and 0.5% (four of 731) were “unacceptable – recommend non-renewal or dismissal.”  Looks like Lynchburg, like Richmond, thinks it is the Lake Wobegon of teachers.

Indeed, it looks like there was little thinking involved and the evaluation process was a bad joke.  I have a Freedom of Information Act request pending at VDOE to see whether their new process is any better (or whether they are part of the VEA conspiracy of secrecy).

The Virginia “Education” Association is threatening to sue VDOE, Brian Davison, and me because they don’t want you to know whether your kid is being subjected to a lousy teacher.  Their behavior demonstrates that their mission is protecting incompetent teachers, not advancing “education.”  That makes the very name of the Association an exercise in mendacity. 

As to Richmond (and, doubtless, Lynchburg) I said earlier:

Seems to me City Council should demand, as a condition of all the money they are spending on [the schools], an audit of teacher performance (SGP in the past, progress tables going forward) with an analysis of Principal and Superintendent actions regarding underperforming teachers.  For sure, if Council doesn’t demand those data, the rest of us will continue to be operated on the mushroom principle (keep ‘em in the dark, feed ‘em horse poop).

Look What VEA Wants to Hide in Richmond

It’s a strange state we live in.

The meetings of our legislators are open to the public; their work product goes in the newspaper and on the Internet. The public is free to evaluate their positions, express opinions, and hold them accountable by voting them in or out of office.

Virginia’s judges perform in open court. Their work product is public and subject to review by the appellate courts. Judicial Performance Evaluations based on feedback from attorneys and jurors go to the General Assembly, which has the power to fire judges, and to the public, which can fire members of the General Assembly.

In contrast, the evaluations of how much the students of any teacher in our public schools have learned (or not) are confidential.  The Virginia “Education” Association says that the public is too stupid (or biased or something) to properly evaluate those data.  The evaluation is left to the school systems, who are free to ignore bad teaching, and do so with gusto.  So the parents of Virginia are left without the information to evaluate their children’s teachers or to oversee the school divisions’ management of the inadequate teachers.

Brian Davison of Loudoun sued the Department of Education and punched a small hole in this conspiracy against Virginia’s schoolchildren.  So, now, the VEA has threatened to sue VDOE, Brian, and me, seeking court orders to prevent, among other things, Brian’s and my disseminating and commenting upon SGP and, perhaps, other data regarding teacher effectiveness (or lack thereof).

At the outset, this demonstrates that the VEA is too stupid to count to “one”: The First Amendment bars this attempted prior restraint of Brian’s and my truthful speech.  (Could it be that the manifest insecurity of the VEA’s lawyer stems from a recognition, however faint, of that stupidity?)

As well, the information already available provides a window into what VEA is trying to hide. 

For three or four years, VDOE calculated Student Growth Percentiles (“SGPs”).  The SGP compares each student’s progress to that of other students who were similarly situated in the previous year(s).  The score change of each student in the group is then reported as a percentile rank from 1 (worst 1% of the group) to 99 (best 1%).
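For the curious, here is a toy illustration of the idea; it is not VDOE’s actual algorithm (which, as I understand it, uses quantile regression on the prior-year scores), but it shows the logic: group students with similar prior scores, then rank each student’s gain within the group.

```python
# Toy illustration of the SGP idea, NOT VDOE's actual algorithm (the
# real calculation uses quantile regression on prior-year scores).
import pandas as pd

df = pd.DataFrame({
    "score_2013": [320, 340, 355, 410, 425, 430, 480, 495],
    "score_2014": [360, 330, 400, 420, 470, 415, 500, 490],
})
df["gain"] = df["score_2014"] - df["score_2013"]

# Peer groups of students with similar prior-year scores (toy bins)
df["peers"] = pd.cut(df["score_2013"], bins=[0, 375, 450, 600])

# Percentile rank of each student's gain within its peer group, scaled 1-99
df["sgp"] = (df.groupby("peers")["gain"].rank(pct=True) * 98 + 1).round()
print(df[["score_2013", "score_2014", "gain", "sgp"]])
```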

The 2014 statewide distribution of average reading SGPs by teacher approaches the ideal normal distribution.

image

The orange curve fitted to the data shows an average of 48.0 with a standard deviation of 9.7.

The Richmond distribution that year leans toward the low end (no surprise there).

image

The fitted curve has a mean of 44.0 and a standard deviation of 11.4.
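The fit itself is routine.  A sketch, using generated stand-in data (since, of course, VDOE won’t give you the real numbers):

```python
# Fitting a normal curve to a distribution of teacher-average SGPs.
# The data here are generated stand-ins for the Richmond averages.
import numpy as np
from scipy import stats

teacher_avgs = np.random.default_rng(0).normal(44.0, 11.4, 300)

mu, sd = stats.norm.fit(teacher_avgs)          # maximum-likelihood fit
x = np.linspace(0, 100, 200)
curve = stats.norm.pdf(x, mu, sd)              # the overlay drawn on the graph
print(f"fitted mean = {mu:.1f}, sd = {sd:.1f}")
```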

Indeed, we know that the actual data are worse: Richmond failed to report a bunch of its (awful) middle school data.  VDOE did nothing about that, of course.

The distribution of individual student reading SGPs in Richmond, again for 2014, also leans toward the low end. 

image

Since we know that students who have shown more progress than their peers get higher SGP scores, this is not good news for Richmond. 

Let’s turn to some specifics.  First some Good News.

Fifth grade teacher No. 74414 (anonymized identifier from VDOE), whose students averaged a 78 SGP, shows a much different distribution.

image

That teacher got even more splendid results in math (average = 93).

image

We could hope this teacher would be in line for a big raise and an assignment to mentor other teachers.

And we have to wonder why the VEA would want to hide this teacher’s name.

Then we have a large number of teachers near the middle of the pack.  For example, here is No. 76273 with SGPs for 21 fifth grade reading students and a 48 average.

image

This same teacher did much better in math, with an 81 average.

image

This is a fine math teacher who might benefit from some work on his/her lesser (albeit average) skills in teaching reading.

The VEA says the adequacy of this teacher should be concealed from the parents of the students in his/her classroom because the information “can be used or misused to make prejudicial judgments about teacher performance.”

Then we have the teachers who are actively harming their students.  As one example, here is Richmond teacher No. 74415, with 25 fourth grade students averaging a reading SGP of 8:

image

Then we also have No. 75318, averaging 8 for 22 fourth grade reading students:

image

The parents of the affected students are not allowed to know who these teachers are.  Indeed, the Virginia “Education” Association would prohibit even my revealing that these teachers exist.

OFFER: I’ll bet you a #2 lead pencil that no child of an RPS teacher, principal, administrator, or School Board member was or will be in 74415’s or 75318’s class.  (But, of course, you are not important enough to have the information to avoid that hazard to your kid.)

Without information for the public to oversee the schools, we know nothing will be done about these and other ineffective teachers:  The assessment system is so pitiful that in 2011 Richmond teachers met or exceeded expectations in 99.28% of the measurements.

Yet VEA says, in effect, “Damn the students!  These teachers might be embarrassed if the parents knew enough to demand their retraining or replacement.”

On its Web site, VEA says:

The mission of the Virginia Education Association is to unite our members and local communities across the Commonwealth in fulfilling the promise of a high quality public education that successfully prepares every single student to realize his or her full potential. We believe this can be accomplished by advocating for students, education professionals, and support professionals.

As to the students who are suffering under inept VEA members and as to the whole notion of “high quality public education,” the threatened VEA suit confesses that this “mission” statement is a shameless lie.  Indeed, the honest name for the organization would be “Virginia Association for the Protection of Incompetent Teachers.”

The Insecure Defending the Unspeakable

It looks like I’m being sued.

Yesterday I received by email a “Petition for Injunction” for VEA and others against VDOE, Brian Davison, and me.  The Petition asks the Richmond Circuit Court to enjoin VDOE from releasing SGP and related data and to prohibit Brian and me from using such data.

Well, as to the name of the court I’m being unduly kind: The Petition is addressed to “The Circuit Court for the City of Richmond” (emphasis supplied).  There is no such court.  By statute, our circuit court is “The Circuit Court of the City of Richmond” (emphasis supplied again).  So, the Petition, if it is genuine, demonstrates at the top of the first page the ignorance of the VEA’s lawyer.

The body of the Petition, entirely aside from any legal merit (actually, lack thereof), illuminates the unfortunate truth that VEA is more interested in protecting incompetent teachers than in furthering the educations of Virginia’s schoolchildren.

Then, at the bottom, the Petition carries a signature block that begins:

Dena Rosenkrantz, Esquire (VSB#28667)
Legal Services Director
VIRGINIA EDUCATION ASSOCIATION

Of course, “Esquire” is a courtesy title, mostly applied to lawyers.  So we have Ms. Rosenkrantz stroking herself with a courtesy title.  As well, the “Esquire” is redundant: The Virginia State Bar number, required by Supreme Court Rule 1:4(l), tells us she is a lawyer.

It’s hard to imagine a lawyer with an ego so shrunken that she feels a need to be courteous to herself and to tell us she is a lawyer lawyer.  But it seems the VEA has found one.

Indeed, it looks like there’s an epidemic of insecurity over there.  The signature block of the purported law clerk who sent the email starts: “Catherine A. Lee, JD.” Really!  A law clerk who feels the need to tell us she has a law degree.  Remarkably, she didn’t attach a law school transcript to show how smart she is or a picture to show how pretty.

Let’s hope this Petition is not a prank and that it will be filed at the courthouse and served on Brian and me.  Dealing with a lawyer at that level of ignorance and with that defect of ego, who is attempting to keep Virginia’s parents from knowing whether their kid is being damaged by an incompetent teacher, should be good fun.

Moreover, VDOE’s lawyer is a competent and affable fellow.  To the extent he is on Brian’s and my side, or even to the extent he isn’t, those qualities will enhance the enjoyment.

——————————————————-

P.S.: I have created a new email account for this litigation: 4students_unlikeVEA@outlook.com.  If you know where and what the Richmond plaintiff, Bradley Mock, teaches or whether he is the same “Bradley Mock” who studied “Hip Hop Culture” at VCU, please do use it to send me an email.