Yet More of What VEA and VDOE Are Trying to Hide

In the discussion of the SGP data that Brian Davison sued out of the State Department of Data Suppression, I’ve been focused on the awful teaching (e.g., here, here, and here) that the Virginia Department of “Education” and the Virginia “Education” Association have been attempting to conceal.  But their efforts to hide teacher performance data from the taxpayers who are paying the teachers have another outrageous effect: They suppress the identities of the many great teachers in Virginia’s public schools.

Having looked at the 43 teachers with the worst three-year average math SGPs, let’s turn to the 43 with the best averages:

image

The “Row Labels” column contains the (anonymized) teacher IDs.  The “Grand Total” column is the three-year average for each teacher.  The “Grand Total” row reports the statewide average of each column.

All but three of the 43 teachers at the bottom of the SGP list damaged Virginia schoolchildren for only one year; almost half of the teachers in the present list helped educate Virginia schoolchildren for more than one year.

Here are the Top Ten who taught all three of the years for which we have data.

image

Notice that all but two of these got better performances from their students in 2014 than in 2012.  And, of those two, No. 115415’s even 99 in 2012, coupled with a much lower overall average, suggests only one student (of doubtful statistical significance) in the first year and substantial progress over the second two years.

The preponderance of NoVa suburbs in this list raises the question whether those divisions have better students, or better teachers, or some combination.  See below for some data suggesting that there is more learning, and presumably more teaching, in some more rural divisions.  In the meantime, here are data for the Top Ten in graphical form.

image

Or, with the ordinate expanded:

image

The division averages provide some further insights.  Let’s start with the distribution of division mathematics averages for 2014.

image

As shown above, the mean math SGP in 2014 was 49.1.  The division mean was 48.3, with a standard deviation of 6.2.
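A note in passing: the statewide mean weights every student equally, while the division mean weights every division equally, so the two figures need not agree.  Here is a minimal sketch of the distinction, with made-up numbers rather than the actual VDOE data:

```python
# Statewide (per-student) mean vs. mean of division averages, using
# hypothetical divisions; large divisions pull the statewide mean more.
import statistics

divisions = {
    "Big":    [55, 60, 50, 58, 52, 49],  # many students
    "Small":  [40, 44],                  # few students
    "Medium": [48, 52, 50],
}
all_sgps = [s for sgps in divisions.values() for s in sgps]
division_means = [statistics.mean(v) for v in divisions.values()]

print(statistics.mean(all_sgps))        # ~50.7: each student counts once
print(statistics.mean(division_means))  # ~48.7: each division counts once
print(statistics.stdev(division_means)) # ~6.1: spread of division averages
```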

The Top Ten divisions of 2014 did not include any of the NoVa suburbs.

image

Neither, for that matter, did the bottom ten.

image

Here is the entire 2014 math list, sorted by division (Colonial Beach, Craig, and Surry had no data).

image

When we turn to the division averages by year, an interesting pattern emerges: The Top Ten in 2014 all improved from 2012.

image

image

Seems to me that a whole bunch of educators should be put on a bus to one of these divisions to find out how to increase student performance in math.

The Bottom Ten in 2014 mostly declined from 2012, except for Franklin and West Point, which improved; King and Queen, which improved slightly; and Galax, which only had data for 2014.

image

image

But the Virginia Department of “Education” and the Virginia “Education” Association don’t want you to see these data.  Apparently you are not qualified to know what you are getting (or not getting) for your tax dollar.

Still More of What VEA Wants to Hide

The SGP data that Brian Davison sued out of the State Department of Data Suppression last year showed (e.g., here and here) that we have some truly awful teachers in Virginia.  The lawsuit threatened by the Virginia “Education” Association demonstrates that it is more interested in protecting those bad teachers than in the effect of those teachers on Virginia’s schoolchildren.  And the VDOE’s own document, confirmed by the available data, admits that the teacher evaluation process has been worse than ineffective.

I’ve turned to the math data to provide some details.

A list of the very worst three-year teacher average math SGPs offers some Good News and some very Bad News. 

  • The Bad: Some truly awful teachers have been inflicted on Virginia’s schoolchildren, sometimes more than once. 
  • The Good: Most of those teachers from 2012 and 2013 were not there the following year.

Here are the worst 43 three-year math SGP averages by teacher, sorted by increasing average.

image

The “Row Labels” column contains the (anonymized) teacher IDs.  The “Grand Total” column is the three-year average SGP for each teacher.  The “Grand Total” row is the statewide average for each year and for the teacher averages.

We can’t tell from these data whether any of the teachers with only 2014 results stayed on to harm schoolchildren in 2015.  The yellow highlights identify the two teachers who came back after a failure to teach in an earlier year.

The red highlight indicates the teacher with the lowest average SGP among those who “taught” all three years.

Turning to the cases where the school division was so evil (or so intimidated by the VEA) that it subjected its students to more than two years’ exposure to the same awful teaching, here are the Top Ten, sorted by increasing three-year average:

image

No. 90763 is an outlier here: The three-year average of 17.1 comes from yearly averages of 71.0, 68.0, and 15.6.  The “.0s” are clues: This teacher likely had only one reported SGP in each of the first two years and a much larger class in 2014.  Indeed, the database shows 72 math SGPs in 2014.  If we multiply that year’s 15.625 average by 72, add the 71 and the 68, and divide by 74, we get 17.08, the three-year average for this teacher.
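For anyone who wants to check that arithmetic, here is a minimal sketch; the single-SGP assumption for 2012 and 2013 is mine, inferred from the “.0” averages, not confirmed by the database:

```python
# Reconstructing No. 90763's three-year average, assuming one SGP in each
# of 2012 and 2013 (inferred from the ".0" yearly averages) and 72 in 2014.
counts     = {2012: 1,    2013: 1,    2014: 72}      # SGPs reported per year
year_means = {2012: 71.0, 2013: 68.0, 2014: 15.625}  # average SGP per year

total   = sum(year_means[y] * counts[y] for y in counts)
average = total / sum(counts.values())
print(round(average, 2))  # 17.08 -- matches the reported three-year average
```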

That said, no telling what else unusual was going on there.

Turning to the other nine cases, we see:

image

Or, upon expanding the ordinate:

image

Only two of the nine showed improvement over the three-year period.  No. 88959 went from a 9.6 to a 28.3, which is still more than twenty percentiles below average.  No. 114261 improved, but only to 17.6.  We have no data regarding the guinea-pig students who were damaged in these experiments.

The other seven “teachers” all showed deteriorated performance over the period.  Note especially No. 54317, who went from bad to appalling (and was paid tax dollars to continue afflicting Chesterfield County schoolchildren).

There are as many as nine principals (depending on time frame in the job and whether it was one or two schools in Chesterfield) who should have been fired over these disasters.  I’ll bet you a #2 lead pencil they all got raises.

(And, for those of us in Richmond who look to the schools in the Counties as the Elysian Fields when compared to the barren wasteland of Richmond Public Schools, the presence of one Henrico and two Chesterfield teachers in this list comes as a shock.)

But, the Virginia “Education” Association and the State Department of Data Suppression don’t want you to know about this, especially if your kid is suffering under such an awful “teacher” and such a pusillanimous school system.

Your tax dollars at “work.”

More of What VEA Wants to Hide

Turning again to the SGP data that Brian Davison sued out of VDOE last year, let’s look at mathematics.

No. 40837, the teacher with the best 2014 math SGP average in Virginia, 95.2, had a fifth grade class in Fairfax.  Here are the data:

image

We can resist the temptation to dismiss this as the result of a class full of very bright students:  Students who scored in the advanced “proficient” range two years running didn’t get an SGP (it’s hard to show growth from really-good-already).  Thus, the students reported here did not obtain superior scores in both 2013 and 2014; they improved radically in 2014, compared to the others in Virginia who had similar performances in 2013.

Of further interest, this teacher’s reading SGPs (average of 65.7) are above average (48.0) but much less spectacular:

image

We can think of all kinds of explanations for this pattern.  In the absence of other data, Friar Occam would tell us to look first at the simple one: This is a superior math teacher and an above average reading teacher.

At the other end of the spectrum, we have No. 76323 whose fifth grade math students in Richmond averaged an SGP of 4.0.

image

The Virginia “Education” Association has threatened to sue because it doesn’t want you to know about this teacher.  But you can bet that the Richmond School Board members, Superintendent, principals and teachers knew and that none of their kids was among the unfortunate 25 in No. 76323’s class.

The reading performance of this teacher was a world better, with an above-average mean of 57.7:

image

This suggests that there is a place for this teacher in Richmond, just not teaching math (absent some heavy-duty retraining).

The second worst 2014 math performance comes from fourth grade teacher No. 71819 in Lynchburg, with an average of 4.4.

image

This same teacher turned in a 25.7 in reading.

image

So, one awful performance and one clearly sub-par. 

Unfortunately, this teacher was getting worse:

image

You can again bet that no child of a School Board member, the Superintendent, or any Lynchburg principal or teacher was damaged by this teacher.

All the schoolchildren of Lynchburg (and their parents who pay the teachers) deserve to be protected from this teacher, and the too many others who are nearly as bad.  Another twenty-nine Lynchburg teachers averaged a math SGP of less than thirty in 2014; eight of those were less than twenty.

image

The 2014 reading performance in Lynchburg was less horrible: One teacher was below an average SGP of twenty, another six were between twenty and thirty. 

The SGP data from 2012 to 2014 tell us that Lynchburg’s average performance was sub-par and deteriorating.

image

Unfortunately, we can’t rely on the schools to deal with the ineffective teachers.  VDOE quotes a 2009 study for the proposition that

99 percent of teachers were rated as satisfactory when their schools used a satisfactory/unsatisfactory rating system; in schools that used an evaluation scale with a broader range of options, an overwhelming 94 percent of all teachers received one of the top two ratings.

The only Virginia data we have on performance evaluations for teachers are from 2011.  That year, 99.3% of the Lynchburg teachers were rated “proficient”; 0.1% (one of 731) were “unacceptable – recommend plan of assistance”; 0.5% (four of 731) were “unacceptable – recommend non-renewal or dismissal.”  Looks like Lynchburg, like Richmond, thinks it is the Lake Wobegon of teachers.

Indeed, it looks like there was little thinking involved and the evaluation process was a bad joke.  I have a Freedom of Information Act request pending at VDOE to see whether their new process is any better (or whether they are part of the VEA conspiracy of secrecy).

The Virginia “Education” Association is threatening to sue VDOE, Brian Davison, and me because they don’t want you to know whether your kid is being subjected to a lousy teacher.  Their behavior demonstrates that their mission is protecting incompetent teachers, not advancing “education.”  That makes the very name of the Association an exercise in mendacity. 

As to Richmond (and, doubtless, Lynchburg) I said earlier:

Seems to me City Council should demand, as a condition of all the money they are spending on [the schools], an audit of teacher performance (SGP in the past, progress tables going forward) with an analysis of Principal and Superintendent actions regarding underperforming teachers.  For sure, if Council doesn’t demand those data, the rest of us will continue to be operated on the mushroom principle (keep ‘em in the dark, feed ‘em horse poop).

Look What VEA Wants to Hide in Richmond

It’s a strange state we live in.

The meetings of our legislators are open to the public; their work product goes in the newspaper and on the Internet. The public is free to evaluate their positions, express opinions, and hold them accountable by voting them in or out of office.

Virginia’s judges perform in open court. Their work product is public and subject to review by the appellate courts. Judicial Performance Evaluations based on feedback from attorneys and jurors go to the General Assembly, which has the power to fire judges, and to the public, which can fire members of the General Assembly.

In contrast, the evaluations of how much the students of any teacher in our public schools have learned (or not) are confidential.  The Virginia “Education” Association says that the public is too stupid (or biased or something) to properly evaluate those data.  The evaluation is left to the school systems, who are free to ignore bad teaching, and do so with gusto.  So the parents of Virginia are left without the information to evaluate their children’s teachers or to oversee the school divisions’ management of the inadequate teachers.

Brian Davison of Loudoun sued the Department of Education and punched a small hole in this conspiracy against Virginia’s schoolchildren.  So, now, the VEA has threatened to sue VDOE, Brian, and me, seeking court orders to prevent, among other things, Brian’s and my disseminating and commenting upon SGP and, perhaps, other data regarding teacher effectiveness (or lack thereof).

At the outset, this demonstrates that the VEA is too stupid to count to “one”: The First Amendment bars this attempted prior restraint of Brian’s and my truthful speech.  (Could it be that the manifest insecurity of the VEA’s lawyer stems from a recognition, however faint, of that stupidity?)

As well, the information already available provides a window into what VEA is trying to hide. 

For three or four years, VDOE calculated Student Growth Percentiles (“SGPs”).  The SGP measures each student’s progress against that of other students who were similarly situated in the previous year(s).  Each student’s score change is then reported as a percentile rank within that group, from 1 (worst 1% of the group) to 99 (best 1%).
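To make the mechanics concrete, here is a toy sketch of the idea.  It is a simplification of my own: VDOE’s actual method builds the comparison groups with quantile regression rather than ranking raw score gains.

```python
# A toy illustration of the SGP concept: rank one student's score change
# within a group of similarly situated peers.  The peer gains are made up;
# real SGPs come from quantile-regression models, not raw gains.
def growth_percentile(student_gain, peer_gains):
    """Percentile rank from 1 (bottom of the group) to 99 (top)."""
    below = sum(1 for g in peer_gains if g < student_gain)
    rank = round(100 * below / len(peer_gains))
    return min(max(rank, 1), 99)

peer_gains = [-5, 0, 2, 4, 6, 8, 11, 15, 20, 30]  # hypothetical score changes
print(growth_percentile(10, peer_gains))           # 60: beat 6 of 10 peers
```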

The 2014 statewide distribution of average reading SGPs by teacher approaches the ideal normal distribution.

image

The orange curve fitted to the data shows an average of 48.0 with a standard deviation of 9.7.
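For readers who want to reproduce that kind of fit, here is a sketch.  The data below are simulated placeholders; the real teacher averages would come from the VDOE spreadsheet.

```python
# Fit a normal curve to a set of teacher-average SGPs.  The sample below is
# simulated; substitute the actual averages to reproduce the figures above.
import numpy as np
from scipy.stats import norm

teacher_averages = np.random.default_rng(1).normal(48.0, 9.7, 1200)  # placeholder
mu, sigma = norm.fit(teacher_averages)  # maximum-likelihood mean and std dev
print(f"mean = {mu:.1f}, std dev = {sigma:.1f}")  # close to 48.0 and 9.7 here
```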

The Richmond distribution that year leans toward the low end (no surprise there).

image

The fitted curve has a mean of 44.0 and a standard deviation of 11.4.

Indeed, we know that the actual data are worse: Richmond failed to report a bunch of its (awful) middle school data.  VDOE did nothing about that, of course.

The distribution of individual student reading SGPs in Richmond, again for 2014, also leans toward the low end. 

image

Since we know that students who have shown more progress than their peers get higher SGP scores, this is not good news for Richmond. 

Let’s turn to some specifics.  First some Good News.

Fifth grade teacher No. 74414 (the number is an anonymized identifier from VDOE), whose students averaged a 78 SGP, shows a much different distribution.

image

That teacher got even more splendid results in math (average = 93).

image

We could hope this teacher would be in line for a big raise and an assignment to mentor other teachers.

And we have to wonder why the VEA would want to hide this teacher’s name.

Then we have a large number of teachers near the middle of the pack.  For example, here is No. 76273 with SGPs for 21 fifth grade reading students and a 48 average.

image

This same teacher did much better in math, with an 81 average.

image

This is a fine math teacher who might benefit from some work on his/her (average but lesser) skills for the teaching of reading.

The VEA says the adequacy of this teacher should be concealed from the parents of the students in his/her classroom because the information “can be used or misused to make prejudicial judgments about teacher performance.”

Then we have the teachers who are actively harming their students.  As one example, here is Richmond teacher No. 74415, with 25 fourth grade students averaging a reading SGP of 8:

image

Then we also have No. 75318, averaging 8 for 22 fourth grade reading students:

image

The parents of the affected students are not allowed to know who these teachers are.  Indeed, the Virginia “Education” Association would prohibit even my revealing that these teachers exist.

OFFER: I’ll bet you a #2 lead pencil that no child of an RPS teacher, principal, administrator, or School Board member was or will be in 74415’s or 75318’s class.  (But, of course, you are not important enough to have the information to avoid that hazard to your kid.)

Without information for the public to oversee the schools, we know nothing will be done about these and other ineffective teachers:  The assessment system is so pitiful that in 2011 Richmond teachers met or exceeded expectations in 99.28% of the measurements.

Yet VEA says, in effect, “Damn the students!  These teachers might be embarrassed if the parents knew enough to demand their retraining or replacement.”

On its Web site, VEA says:

The mission of the Virginia Education Association is to unite our members and local communities across the Commonwealth in fulfilling the promise of a high quality public education that successfully prepares every single student to realize his or her full potential. We believe this can be accomplished by advocating for students, education professionals, and support professionals.

As to the students who are suffering under inept VEA members and as to the whole notion of “high quality public education,” the threatened VEA suit confesses that this “mission” statement is a shameless lie.  Indeed, the honest name for the organization would be “Virginia Association for the Protection of Incompetent Teachers.”

Why Does VDOE Use Biased Data to Accredit Our Schools?

VDOE has an elaborate scheme to accredit (or not accredit) Virginia’s schools.  The basis is SOL pass rates (plus, for high schools, the graduation rate that depends on passing at least six end-of-course SOL tests).

But we know that the SOL is influenced by economic status.  For example, here are the 2015 reading pass rates by division vs. the percentage of economically disadvantaged students in the division.

image

We’re not here to discuss whether this correlation suggests that more affluent families live in better school districts, whether their children are better prepared for school, whether their children have higher IQs, or whatever.  The point here is that more affluent kids will show better SOL scores than less affluent students.

That’s only part of the problem with accreditation.  VDOE adjusts (I would say “manipulates”) the accreditation data in secret ways that mostly boost the scores.  In one case, that manipulation converted a 76.3 and a 73.7 into “perfect scores” and embarrassed the Governor.

So it’s no surprise that VDOE has not used, and now is abandoning, a measure of student progress that is insensitive to economic advantage or disadvantage and that might even be resistant to manipulation, the Student Growth Percentile (“SGP”).

VDOE says:

A student growth percentile expresses how much progress a student has made relative to the progress of students whose achievement was similar on previous assessments.
A student growth percentile complements a student’s SOL scaled score and gives his or her teacher, parents and principal a more complete picture of achievement and progress. A high growth percentile is an indicator of effective instruction, regardless of a student’s scaled score.

VDOE calculated SGPs in reading, math, and algebra for at least three years, ending in 2014. Then they abandoned the SGP for a new measure that looks to be coarser than the SGP. 

VDOE says that the new measure might be useful in the accreditation process because it allows “partial point[s] for growth,” i.e., another way to boost the scores.  There is no mention of sensitivity to economic disadvantage.

How about it, VDOE?  Does your dandy new measure of progress cancel the advantage of the more affluent students?  And if it does, will you use it to replace the SOL in the accreditation process?

Kudos to the VaCU

I’ve been a customer at the Virginia Credit Union for some time, especially during the period a while back when their CD rates were competitive.

From time to time they offer free document shredding at their Boulders location.  We’ve been frustrated there in the past when the line of cars reached out onto Jahnke Rd.

Disdaining the lessons of the past, we drove there about 08:40 this morning, expecting to wait in line until the 09:00 opening.  We were surprised to find them open early, with two lines and four shredding trucks.  We left our bag of old documents and were gone in a few moments.  Indeed, when we drove past an hour or so later there still was no line visible from Boulders Pkwy.

All Glory to the VaCU, especially to the person there who decided to improve this helpful public service.

Now, if they’ll just bump their 60 month CD rate to match, or even come a bit closer to matching, their online competitors . . .

The Insecure Defending the Unspeakable

It looks like I’m being sued.

Yesterday I received by email a “Petition for Injunction” for VEA and others against VDOE, Brian Davison, and me.  The Petition asks the Richmond Circuit Court to enjoin VDOE from releasing SGP and related data and to prohibit Brian and me from using such data.

Well, as to the name of the court, I’m being unduly kind: The Petition is addressed to “The Circuit Court for the City of Richmond” (emphasis supplied).  There is no such court.  By statute, our circuit court is “The Circuit Court of the City of Richmond” (emphasis supplied again).  So the Petition, if it is genuine, demonstrates at the top of the first page the ignorance of the VEA’s lawyer.

The body of the Petition, entirely aside from any legal merit (actually, lack thereof), illuminates the unfortunate truth that VEA is more interested in protecting incompetent teachers than in furthering the education of Virginia’s schoolchildren.

Then, at the bottom, the Petition carries a signature block that begins:

Dena Rosenkrantz, Esquire (VSB#28667)
Legal Services Director
VIRGINIA EDUCATION ASSOCIATION

Of course, “Esquire” is a courtesy title, mostly applied to lawyers.  So we have Ms. Rosenkrantz stroking herself with a courtesy title.  As well, the “Esquire” is redundant: The Virginia State Bar number, required by Supreme Court Rule 1:4(l), tells us she is a lawyer.

It’s hard to imagine a lawyer with an ego so shrunken that she feels a need to be courteous to herself and to tell us, twice over, that she is a lawyer.  But it seems the VEA has found one.

Indeed, it looks like there’s an epidemic of insecurity over there.  The signature block of the purported law clerk who sent the email starts: “Catherine A. Lee, JD.” Really!  A law clerk who feels the need to tell us she has a law degree.  Remarkably, she didn’t attach a law school transcript to show how smart she is or a picture to show how pretty.

Let’s hope this Petition is not a prank and that it will be filed at the courthouse and served on Brian and me.  Dealing with a lawyer at that level of ignorance and with that defect of ego, who is attempting to keep Virginia’s parents from knowing whether their kid is being damaged by an incompetent teacher, should be good fun.

Moreover, VDOE’s lawyer is a competent and affable fellow.  To the extent he is on Brian’s and my side, or even to the extent he isn’t, those qualities will enhance the enjoyment.

——————————————————-

P.S.: I have created a new email account for this litigation: 4students_unlikeVEA@outlook.com.  If you know where and what the Richmond plaintiff, Bradley Mock, teaches or whether he is the same “Bradley Mock” who studied “Hip Hop Culture” at VCU, please do use it to send me an email.

Board of “Education” Ongoing Malfeasance

As I pointed out two days ago, the Board of Education has the duty and authority “to see that the [mandatory attendance laws] are properly enforced throughout the Commonwealth.”  Yet the Board does not even collect information to allow it to assess school or division compliance with the law.

The Board first announced a truancy rulemaking on July 22, 2010; it still is without a regulation.

The most recent proposal was on the April 28 agenda but, come the meeting, the matter was “withdrawn” [see the video at 3:17:38].

Of course, I filed a Freedom of Information Act request for the paperwork undergirding the withdrawal.  Melissa Luchau, the Director of Board Relations for VDOE, responded:

There are no records responsive to your request.

The item was withdrawn from the agenda because the staff with expertise in this area were unable to attend the Board meeting due to health reasons.

That raises more questions than it answers:

  • Is the VDOE bureaucracy so siloed that only the directly involved “staff” (no telling how many) know about the proposed regulation?
  • In particular, does a regulation go to the Board without the Director of Board Relations and her boss knowing all about it?
  • Still more particularly, are the Director of Board Relations and her boss too busy to get briefed when it appears that the affected “staff” may not be available?
  • Even more particularly, why is the Board extending, yet again, its tolerance of Richmond’s (and doubtless other divisions’) wholesale violations of the mandatory attendance laws?

Your tax dollars “at work.”

Maggie What?

The Times-Dispatch this morning reports that Open and Community high schools have been rated among the top ten schools in Virginia.

That’s no surprise.  Except perhaps as to math and science, both schools do an outstanding job.

image

image

The surprise is that Maggie Walker did not make the list.

The MW Web site [School Profile page] tells us that the Daily Beast ranked Walker 12th best public high school in the nation on August 27, 2014.  Yet, if you go to the VDOE Web site you won’t even find SOL scores for Walker.

Ask VDOE about this and they’ll tell you something like what they told me:

Governor’s Schools are regional centers or programs and do not report membership [what you and I would call enrollment] to the Virginia Department of Education (VDOE). Students who attend these programs are included in the average daily membership (ADM) of the schools they otherwise would have attended in their home division. Only schools that report membership to VDOE are assigned a school number.

The assessments of students attending regional centers and programs are coded with the school numbers of the schools where these students are included in membership. This is a long-standing practice that pre-dates the Standards of Learning (SOL) program.

Note, however, that this is not true for Fairfax County’s Thomas Jefferson, the Governor’s School for Science and Technology in Northern Virginia.

image

image

In short, Maggie Walker, a four-year, full day, public high school that issues diplomas to its graduates, is not a “school.”  The SOL (and other) scores of the MW students are falsely reported at the high schools in those students’ home districts.  So, of course, if you look to the official SOL data, MW does not exist.

Do you wonder why I call VDOE the “State Department of Data Suppression and Manipulation”?

Big Budgets Don’t Teach

There is an unfortunate tendency to measure school quality in terms of inputs, particularly money, although the important measure should be the output: how much the kids learn.  The SOL provides a ready, but rough, measure of output.

We have seen that neither division expenditure nor excess local financial effort nor average division teacher salary correlates with SOL pass rates.

But we know that SOL scores are sensitive to at least one other variable, the economic status of the students: Economically disadvantaged students score lower, on average.

For about four years, until last year, VDOE produced student growth percentile (“SGP”) data that are not dependent upon economic advantage or disadvantage.  VDOE attempted to hide those data but Brian Davison of Loudoun pried loose some of the data by division.

I’ve taken the latest of those data, for 2014, and juxtaposed them with the teacher salary data for that year to see if there’s anything to learn there.

To start, here are the division reading SGPs plotted vs. the division average teacher salary (regular K-12 classroom teachers plus art, music, physical education, technology, remedial, gifted, mathematics, reading, special education, and ESL teachers; teacher aides, guidance counselors, and librarians are not included in the calculation).

image

The 7.1% R² tells us there is only a faint correlation between how much the teachers are paid and how much the students learn.
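For anyone who wants to check that number against the posted dataset, here is a minimal sketch of the R² computation; the salary and SGP values below are placeholders, not the real division data.

```python
# Least-squares fit of division SGP vs. average teacher salary; R-squared
# is the squared correlation coefficient.  Values are illustrative only.
from scipy.stats import linregress

salaries = [44000, 47000, 51000, 55000, 62000, 71000]  # hypothetical divisions
sgps     = [47.2,  44.9,  50.1,  46.5,  49.8,  51.0]

fit = linregress(salaries, sgps)
print(f"slope = {fit.slope:.6f}, R^2 = {fit.rvalue**2:.3f}")
```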

Even so, there is some interesting information in these numbers.

For a start, we see that Richmond (the gold square) and the peer jurisdictions (red circles, from the top Hampton, Norfolk, and Newport News) all are getting better results than the raw SOL pass rates suggest.  Richmond still underperforms, but the peer jurisdictions are close to the middle of the pack, with Hampton just above the 46.6% division median.

The Big Spenders here are the NoVa divisions: the divisions above $60,000 are, from the left, Loudoun, Prince William, Manassas City, Fairfax, Falls Church, Alexandria, and Arlington.  The outstanding performers, all but one with modest salaries, are Poquoson, Bland, Botetourt, Wise, and Falls Church.

The eye opener is the surrounding counties (the green circles, from the top Charles City, Chesterfield, Henrico, and Hanover): Charles City is outperforming the others; the others are not obtaining student growth at any level to brag about.  Of course, all are beating Richmond, but by much less than the raw pass rates would suggest.

The math data are less flattering to Richmond but similarly interesting.  Caveat: These data do not include algebra (the algebra records would not fit within Excel’s row limit).

image

The 3% R² again indicates no significant correlation between the SGPs and the salaries.

Here we see Richmond, again the gold square, 5.5 percentiles below the math median of 48.4, compared to the 3.1 points below the reading median.  In blue, Newport News and Hampton are above the median; Norfolk is 1.8 points below (and should be dropping unless they are cheating).

The Counties (green circles, from the left Charles City, Hanover, Chesterfield, and Henrico) are in the middle of the pack, except for Henrico, which is 1.8 points below Richmond.

The Big Spenders are getting less for their money here than with reading.

The outstanding performers are, from the top, Bristol, Buckingham, Bland (again), Botetourt (again), and Surry.  We probably should reserve judgment about Botetourt; they were caught cheating wholesale with the VGLA.

The dataset is posted here.

The bottom line: Spending more money on teacher salaries does not correlate with better student learning of reading or math.

But, then, we knew all along that the key to student performance is management, not money.
