What’s Up at TJ?

Let’s take a look at the federal graduation rates vs. the SOL pass rates for Richmond’s mainstream high schools.

I have left out the schools with selected clienteles (Community, Open, and Franklin), as well as Alternative, where the other schools dump their troublesome kids.  That leaves the five mainstream schools: Armstrong, Huguenot, Marshall, TJ, and Wythe.

CAVEAT: Throughout the following discussion, please recall that the numbers are, to an unknown extent, bogus.  For example, all these high schools enjoy SOL boosts from the scores of Maggie Walker students who do not attend any of these schools.  As well, the federal graduation rates this year have been jiggered to count most of the Modified Standard diplomas as Standard.  Moreover, the secret retesting boosts the SOLs (my estimate for the Algebra I retests in 2014 was an average of 24.6 points for the retested kids); only VDOE knows how much and they won’t tell.  Then there is “remediation recovery,” which may be another kind of retesting.  As well, these schools have dumped many of their disruptive kids on Alternative and have chased out many of the other low-performers.

Turning to the data we have: If we just consider the data fit (and not the inexcusably low pass rates and graduation rates), the plot of graduation rate vs. pass rate for the writing tests looks reasonable enough.

image

We could expect the pass rate on this EOC English test, which must be passed to graduate, to correlate with the graduation rate.  Indeed, the correlation here is excellent.

The reading data show quite another pattern.

image

Here the R-squared drops to 24% – still an R of about 0.5, but much reduced from the writing situation.  And here two schools are far out of line: TJ (high) and Wythe (low).

That 69% reading pass rate at TJ is not at all consistent with the 83% graduation rate.
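
For anyone who wants to reproduce these fits, here is a minimal sketch.  The five data pairs below are invented placeholders, not the actual RPS numbers (only TJ’s 69/83 pair appears in the text above).

    # Minimal sketch: fit graduation rate against EOC pass rate for five schools.
    # The numbers below are placeholders for illustration, not the real data.
    import numpy as np
    from scipy import stats

    pass_rate = np.array([52.0, 61.0, 66.0, 69.0, 58.0])   # EOC pass rates (%)
    grad_rate = np.array([60.0, 74.0, 71.0, 83.0, 68.0])   # federal graduation rates (%)

    fit = stats.linregress(pass_rate, grad_rate)
    print(f"R = {fit.rvalue:.2f}, R-squared = {fit.rvalue**2:.0%}")
    # Note that R is the square root of R-squared: an R-squared of 24%
    # corresponds to an R of about 0.49, the "R of 0.5" mentioned above.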

In the other subjects (History & Social Science, Science, and Math), as with the two English subjects, at least one verified credit – i.e., a passed EOC SOL test – is required for a standard diploma.

Here are the data for those other subjects.

image

image

image

Here we again see lower correlations, with essentially none for the History & Social Science tests.  TJ is anomalously high in all cases, with Wythe and Huguenot trading places as the anomalously low school.

It’s easy to understand Armstrong: Low pass rates and low graduation rates.  Huguenot, Marshall, and Wythe are the middle ground, with Marshall generally outperforming the fitted lines. 

History & Social Science is the extreme case here: The student needs three standard credits (pass the course) and one verified credit (pass the course and the End of Course SOL) in Hist. & SS to get a standard diploma.  Yet TJ with a 58% EOC pass rate (better only than Armstrong) had an 83% diploma rate (best of the bunch).

The puzzle becomes more puzzling when we look at the diploma types.

image

TJ is the only school of the five with more advanced than standard diplomas.  Yet the advanced diploma requires nine verified credits vs. six for the standard diploma.

Looking just at the pass rates, we see TJ leading the pack in writing, the one subject where its graduation rate fits the pattern.

image

In science, TJ is in front a bit (albeit below the accreditation level); in the other subjects, especially history & SS, its pass rate is within the pack.  Yet its graduation rate is outstanding (well, outstanding by Richmond standards; still five points below the state average).

Of course, the students get four years to earn the required verified credits.  And, for the most part, our high schools have done better in the past.

image

Despite being in the dark about all the retakes and adjustments, we get to wonder, especially about History & Social Science at TJ (where the pass rate never reached even the 70% accreditation level during these four years).

Here, for the record, are the four-year histories in the other four subjects.

image

image

image

image

If our Board of “Education” were interested in transparency, we wouldn’t have to wonder whether something funny might be going on here.  And if pigs had wings . . .

More Mischief at VDOE

VDOE is running a “survey.”

After two preliminary questions (nature of your interest in schools; where you live), the survey (reproduced below) asks how important the responder thinks various factors are “for federal accountability purposes.”

There are fourteen questions about matters such as pass rates, academic growth, graduation rate, dropout rate, etc. 

Then, second from the bottom, there is one about the crucial element in school success, the teachers.  But Our Leaders fall back on the old, misleading stupidity of measuring inputs and ignoring outputs: They ask about teacher credentials, not teacher effectiveness.

This fits into an ugly pattern:

It is clear that SOL performance decreases with increasing poverty.  Under the federal prod, VDOE began calculating Student Growth Percentiles (“SGPs”) that were essentially uncorrelated with economic disadvantage. 

The SGPs proved to be embarrassing to the Virginia “Education” Association and VDOE: They allowed a clear measure of teacher effectiveness (see this and this).  So VDOE abandoned ship and introduced “Progress Tables” that roughly measure student progress but mostly ignore the lack of progress.

Now they ask about the importance of teacher credentials (that do not measure teacher effectiveness) but they ignore even their diluted measure of teacher effectiveness.

Guess what: You and I get taxed to pay for this deliberate sidelining of the most important public influence on our schoolchildren, teacher effectiveness.

—————————–

Here is the meat of the survey form:

image

Maggie What?

The Times-Dispatch reports this morning that US News & World Report ranks Community and Open 9th and 10th in Virginia based on college readiness.

First place in the state is Fairfax County’s TJ (actually located in Alexandria!), a Governor’s School.

Absent from the list is Maggie Walker, also a Governor’s School.

“What?” you say!  Maggie Walker is a public high school for high-ability students, issues diplomas to its graduates, is governed by a school board comprised of representatives from twelve local school systems, and is accredited as a “school.” 

But VDOE says it’s a “program,” not a “school.”  (While Governor’s School TJ in Alexandria is a “school.”)

AND, since MLW is not a school, the SOL scores of its students are reported to the high schools in their home districts [note: old document; today Pearson surely reports the scores directly], although they do not attend those schools.

Do you suppose the feds know that VDOE is lying to them about the SOL scores of our local high schools?  Do you care that VDOE brokered this corrupt deal so the local Superintendents would let their bright kids go to MLW without lowering the SOLs of the local high schools?

BTW: The Daily Beast has better information: In ‘14 they ranked MLW the #12 public school in the nation.

Your Tax Dollars at “Work”

The June 2016 post from our Superintendent discusses at some length the current (2017, based on 2016 data) accreditation results.  The VDOE Accreditation page, last updated on Feb. 29, 2016, shows the data from last year but not the current numbers.

Why do you suppose that the Richmond Superintendent had those numbers some time last month but you and I still cannot get them?

Indeed, why do you suppose VDOE could use the SOL data to assess accreditation in June (or before) but cannot use the same data to calculate SGP until Fall [pdf at slide 2]?

Stay tuned while I try to find out. 

In the meantime, consider the possibility that VDOE is, among other things, the State Department of Superintendent Protection far more than the Department of “Education.”

Economic Disadvantage and Richmond’s Awful Middle Schools

We’ve seen that division SOL pass rates fall with increasing economic disadvantage.  Those data also suggest that Richmond’s gross underperformance is not explained by the economic disadvantage of the Richmond students.

Drilling further into the relationship between academic performance and economic disadvantage (ED for short), the reading pass rates of Richmond’s elementary schools show a moderate correlation with ED and the mathematics a weak correlation, but our middle and high schools show considerably more robust correlations:

image

image

image

Here are the SOL/ED data:

image

image

image

image

Note: Franklin has both middle and high school grades; I omit it from the graphs because it does not directly compare to either kind of school.

Caveat: Correlation is a necessary but not sufficient condition to infer causation.

The other thing to notice about the middle schools is the very low pass rates.  Here, for reference, are the average pass rates by grade.  The horizontal lines are the reading and math “benchmarks” for accreditation.

image

Why do the middle schools get much lower SOL pass rates with mostly the same kids as the elementary schools?  Let’s infer that the middle schools are doing a much worse job.  See below.

In any case, the R2s imply that the SOL is affected, especially in the middle and high schools, by economic condition or something related to it.
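
Here is a sketch of the calculation behind those R2s, assuming a hypothetical file with one row per school (the column names are mine, not VDOE’s):

    # Sketch: pass-rate vs. %ED R-squared, computed separately for elementary,
    # middle, and high schools. File layout and column names are assumptions.
    import pandas as pd

    schools = pd.read_csv("rps_schools_2015.csv")   # columns: school, level, pct_ed, pass_rate

    r2_by_level = (
        schools.groupby("level")
               .apply(lambda g: g["pct_ed"].corr(g["pass_rate"]) ** 2)
    )
    print(r2_by_level)
    # Larger R-squared values for the middle and high schools would mirror
    # the "more robust correlations" described above.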

The Student Growth Percentile (SGP) was supposed to remove that correlation, so I turned to the latest available data, the 2014 data by school in the second download from VDOE in response to Brian Davison’s suit.

There are no high school reading or mathematics data for Richmond in that dataset (EOC Algebra I only) but the elementary and middle school results are compelling. 

image

Here we see our elementary schools performing at about the 50th percentile on math and a notch lower on reading.  Those performances were mostly uncorrelated with ED (reading R2 of 1%; math, 3%).  The Good News: These learning measures, esp. the reading, are a bit better than the SOL pass rates might suggest.

The school with a reading SGP of 71 (!) is Carver; the 63 is Jones.  As to math, we have six schools above the 60th percentile (Ginter Park at 70; Fisher, 67; Carver, 66; Jones, 65; Munford, 64; and Greene, 62), with Reid in the basement at 32.  That collection of reading SGPs just under 40 is not encouraging.

Caveat: These data use the whole school %ED from the Fall census.  The VDOE data would allow calculation for only the SGP grades, 4 & 5, except that their data suppression rules give blank ED values for Munford and Henry by suppressing the fifth grade data (fewer than ten kids reported).  The totals are larger than the sums for the individual grades and presumably include all the ED students, so I’ll stick with the (presumably undoctored) total data.
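
Here is a sketch of that choice, assuming hypothetical files for the school SGP averages and the Fall membership census (file and column names are mine):

    # Sketch: use the whole-school %ED from the Fall census rather than summing
    # grade-level counts that the suppression rules have blanked out.
    import pandas as pd
    from scipy import stats

    sgp = pd.read_csv("rps_school_mean_sgp_2014.csv")   # school, reading_sgp, math_sgp
    census = pd.read_csv("fall_census_2013.csv")        # school, total_members, total_ed

    census["pct_ed"] = 100 * census["total_ed"] / census["total_members"]
    merged = sgp.merge(census[["school", "pct_ed"]], on="school")

    for subject in ("reading_sgp", "math_sgp"):
        data = merged[["pct_ed", subject]].dropna()
        fit = stats.linregress(data["pct_ed"], data[subject])
        print(subject, f"R-squared = {fit.rvalue ** 2:.1%}")
    # R-squared values near the 1% (reading) and 3% (math) quoted above would
    # say the school SGP averages are essentially unrelated to %ED.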

Here are the data:

image

The two very low ED schools are Munford at 10%, performing well above the 50th percentile, and Fox at 22% ED scoring at the 50th percentile in reading but only the 44th in math.  This makes it look like those nice SOLs at Fox are the result of smart kids who are scoring well but not improving as much as the smart kids in other schools.

The 24th percentile score in math is Reid.

The conclusion: On the 2014 data, our elementary schools are doing an average job, on average.  There’s work to be done at Reid and some others but, all in all, the SGPs report more learning than the SOLs might suggest.

And how much the kids learned was generally unrelated to economic disadvantage.

The middle schools were an unhappier story:

image

image

The database let me pull the 6th, 7th, and 8th grade data so I’ve included Franklin.

Note the low average performance and the modest correlation of the math scores.  Also notice the absence of schools with low ED populations.

As to that last point, these data raise the question whether those low ED kids from Munford and Fox have dropped out, gone to the Counties or to private schools for middle school, or whether their numbers just disappear into the average.

On that issue, here, first, are the totals:

image

And here are the details:

image

Or, relative to the 9th grade memberships:

image

VDOE publishes no data on kids who drop out before entering middle school.  The data they do share indicate zero dropouts from grades 7 or 8 in 2014.  That seems unlikely but it’s all the information we have.

We are left with the possibility that the middle school drop in membership and rise in %ED reflects some of the more affluent kids fleeing to private schools and to the Counties.  The precipitous drops in both total and ED membership after the 9th grade surely come from dropouts.

But to revisit the major point: The low correlations with ED tell us that the low middle school SGPs can’t be caused by the increased economic disadvantage; the leading candidates for those lousy SGPs, then, are lousy teaching and/or lousy administrators who fail to control the middle schools.

The other point here: The State Department of Data Suppression has stopped calculating SGPs, which leaves us with the manifestly flawed SOL data to assess school (and teacher) quality.  It seems we’ll have to wait until late summer to see whether they are going to release or suppress their new progress (aka “value”) tables that measure academic progress (but mostly ignore the lack of it).

SOL vs. Economic Disadvantage, Again

VDOE had the 2016 SOL scores in time to decide who could graduate in May.  But they won’t release those scores to the taxpayers who paid for them until late Summer. 

So let’s look a bit harder at the data we do have.

We’ve seen that division pass rates fall with increasing economic disadvantage (no surprise there).  Those data also suggest that Richmond’s gross underperformance is not explained by the ED of the Richmond students.

image

image

Richmond is the gold square on both graphs.

Let’s break out the division pass rates by pass proficient and pass advanced.  Here are the reading data.

image

Hmmmm.  Here we see that the division reading test correlation is driven by the even better correlation of the pass advanced data.  There is essentially no correlation between the pass proficient rate and economic disadvantage. 

As to pass advanced, Richmond, the gold square, is almost on the fitted line, and the peer jurisdictions Hampton, Newport News, and Norfolk (the red diamonds, from the left) are nearby.  But Richmond is second from the bottom on pass proficient and this drags the City down to second from the bottom overall.  Among the peers, Norfolk looks to underperform as to the proficient rate, Hampton is about as expected, and Newport News is in between.

The data pair at 7% ED is Falls Church, which overperforms as to advanced and underperforms as to proficient.  The data at 13% are Lexington, which does the same, only more so.

Turning to the math data, we see a similar picture except the pass advanced correlation coefficient decreases.

image

Here, Richmond underperforms as to both proficient and advanced. 

These data suggest:

  • Poverty vel non does not excuse RPS’s lousy performance (no surprise there);
  • RPS is doing fairly well by (or, at least, not visiting too much harm upon) its better performing kids; and
  • If RPS sensibly wants to attack its performance problem where it is worst, it will work to do a better job with the marginal students.

The Empire Strikes Back. Feebly.

On June 2, Donald Wilms, President of the Chesterfield Education Association, responded in the Times-Dispatch to Bart Hinkle’s editorial of May 28.

Hinkle had made the point that the Virginia Education Association’s attempt to suppress truthful data on teacher effectiveness sought to keep “parents and taxpayers . . . in the dark about which teachers are doing a great job – and which ones aren’t.” 

Wilms sought to argue that the data should be kept secret.  He mostly demonstrated that the Chesterfield Education Association needs a better argument.

Wilms brought on some emotional arm-waving about the students who may come to school on test day after oversleeping and missing breakfast; after a fight; after losing a boy-/girlfriend; or the like.  He neglected to mention that the data he disdains are based on two (or more) successive years’ test scores:  An outside event in the second year could lower a student’s measured progress but the same event in the first year could increase the score difference and, thus, the progress measure.
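
A toy example (my numbers, purely for illustration) of how that cuts both ways:

    # Toy illustration: a one-off bad day changes the two-year score difference
    # in opposite directions depending on which year it falls in.
    typical_score = 400   # what this student would score on a normal day
    bad_day_score = 370   # the same student after a rough morning

    print(bad_day_score - typical_score)   # -30: bad day in year two lowers measured progress
    print(typical_score - bad_day_score)   # +30: bad day in year one raises measured progress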

The nut of Wilms’ argument, however, was the kids who are at a systematic disadvantage:

[W]ould it be fair for schools with high-income households — where both parents are college-educated, where kids go to museums and take exotic vacations, where parents have two cars and are available to take kids to the public library — to compete with schools where kids live in single-parent households, where parents hold several low-wage jobs, with hours preventing them from being home to take kids to museums or public libraries, and incomes preventing them from any vacation at all? Heck, it would be unfair to compare a school in western Chesterfield with one in eastern Chesterfield, let alone to compare one of Chesterfield’s (or Henrico’s) wealthiest school communities with one of Richmond’s neediest school communities, don’t you think?

* * *

What these [SGP data] really illustrate is which teachers have the at-risk kids and which don’t.

On these emotional grounds, Wilms attacked the fairness of the “flawed rankings” of the Student Growth Percentile, the “SGP” (and, apparently, other measures of teaching effectiveness).  He disdained to discuss the actual data, which are to the contrary.

When it was considering adopting the SGP, VDOE provided data showing that test scores fall with increasing economic disadvantage (no surprise there) but that SGP percentiles do not:

image

That is because the SGP, by design, compares low-performers only to other low-performers.  Unlike the SOL, it measures progress relative to others similarly situated.

Indeed, the data for Chesterfield County make a clear case that students who score poorly one year, for whatever reason, can show outstanding academic growth the next year.  Here, for instance, are the 2014 math SGP scores of Chesterfield students plotted against those same students’ 2013 SOL scores.

image

Before drawing any conclusions, we need to refine the analysis slightly: Students who score pass advanced (SOL 500 or above) for two years running do not receive an SGP (it’s hard to improve on “excellent”).  In a sense, the SGP penalizes teachers and schools with large numbers of very bright students!

The SOL scores ≥ 500 in the graph above represent students who scored advanced in 2013 but less than advanced in 2014.  The students who scored advanced in both ‘13 and ‘14 do not appear in this dataset at all, which biases the analysis.  So let’s look only at the students with 2013 SOLs < 500:

image

The result doesn’t change much.  The R2 value of 0.14% tells us that the SGPs are quite uncorrelated with the previous year’s SOL scores.  Of course, correlation is necessary but not sufficient to show causation.  Said otherwise: Chesterfield students with low SOL scores in the previous year, for whatever reason, show superior (or inferior) academic growth (as measured by the SGP) in the current year just as often as students who scored well in the previous year.
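
The filtering step and that R2 can be reproduced along these lines (the column names are my guesses at the layout of the student-level file):

    # Sketch: drop the students who scored advanced (500+) in 2013, then see how
    # much of the 2014 SGP the 2013 SOL explains. Columns are assumptions.
    import pandas as pd

    students = pd.read_csv("chesterfield_math_2013_2014.csv")   # sol_2013, sgp_2014

    below_adv = students[students["sol_2013"] < 500].dropna()
    r_squared = below_adv["sol_2013"].corr(below_adv["sgp_2014"]) ** 2
    print(f"R-squared = {r_squared:.2%}")
    # A value near 0.14% means the prior-year SOL explains essentially none of
    # the variation in the current-year SGP.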

We can’t directly test Wilms’ statement about pitting a rich school against a poor one because VDOE (known here as the Virginia Department of Data Suppression) hasn’t released (and won’t release) the data.  So let’s go one better: Let’s use the data we have to compare some of the worst students in Chesterfield in terms of 2013 SOL with some of the best.

Using the math data, here is the first group, all at 50 points or more below the passing score of 400:

image

As before, the 2014 SGP does not correlate with the 2013 SOL. 

Next the 2013 SOLs between 450 (50 points above the passing score) and 499 (where we stop to avoid the advanced student penalty).

image

Pretty much the same story: High or low SOL one year does not predict SGP growth in the next year.

Said yet otherwise: Part of teaching in the public schools is dealing with kids who are disadvantaged; the SGP identifies teachers who do that well; the SOL may not.

BTW: The reading data tell the same story with slightly different numbers.  Let’s avoid some clutter and leave them out. 

It’s easy to understand why Wilms would prefer the current system.  In 2011 (the only year that VDOE slipped up and posted evaluation data), Chesterfield teachers were evaluated on seven criteria.  Six of those were inputs, i.e., relatively easy to measure but only distantly related to student learning.  On the only important measure, “Student Achievement and Academic Progress,” only twelve of 1222 Chesterfield teachers (1%) were said to need improvement and none was unsatisfactory. 

image

But when we look at real performance data (2012 SGPs are the earliest that Brian Davison sued out of VDOE), we see student progress that was much worse than “unsatisfactory.”  For example here is the math performance of Teacher No. 34785 (identifier anonymized by VDOE):

image

The yellow points are the annual SGP averages of that teacher’s math students.  The blue bars show the 95% confidence intervals. 

The Chesterfield math averages those years were 48.7, 49.2, and 48.2.
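
The yellow points and blue bars can be computed this way (the list of student SGPs below is invented; only the construction matters):

    # Sketch: one teacher's annual mean student SGP with a 95% confidence interval,
    # the construction behind the yellow points and blue bars. Data are invented.
    import numpy as np

    def mean_with_ci(sgps, z=1.96):
        sgps = np.asarray(sgps, dtype=float)
        mean = sgps.mean()
        half = z * sgps.std(ddof=1) / np.sqrt(len(sgps))
        return mean, mean - half, mean + half

    one_year = [12, 8, 25, 19, 31, 15, 22, 9, 27, 18]   # hypothetical student SGPs
    print(mean_with_ci(one_year))
    # If even the top of the interval sits far below the division average
    # (roughly 48 to 49 in Chesterfield), chance is an unlikely explanation.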

Then we have Nos. 54317 and 86898:

image  image

To put some context on these numbers, here is the 2014 statewide distribution of math SGP averages by teacher.

image

The mean was 49.3; the estimated standard deviation, 16.3.  That is, assuming a roughly normal distribution, about sixteen percent of the teachers were below 33.0; another sixteen percent, above 65.6.  Two and a half percent were below 16.7; another 2.5%, above 81.9.  I’ve marked those values on the 2014 Chesterfield distribution:

image
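
Those cutoffs follow from the usual normal-curve rule of thumb (about 68% of a normal distribution lies within one standard deviation of the mean, about 95% within two):

    # The one- and two-standard-deviation cutoffs quoted above, assuming the
    # teacher averages are roughly normally distributed.
    mean, sd = 49.3, 16.3

    print(f"{mean - sd:.1f}  {mean + sd:.1f}")       # 33.0 and 65.6: ~16% of teachers below / above
    print(f"{mean - 2*sd:.1f}  {mean + 2*sd:.1f}")   # 16.7 and 81.9: ~2.5% below / above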

The problem here is not “flawed rankings.”  It is flawed teachers that the Chesterfield “Education” Association does not want the parents of Chesterfield County to know about.

BTW: The data also show that there are some really fine teachers in Chesterfield.  For example, in math we see Nos. 29893, 28974, and 81816:

image image image

The Chesterfield and Virginia “Education” Associations don’t want you to know about these outstanding teachers, either.  They just want you to think that 99% are above average.

And VDOE is a co-conspirator.  Your tax dollars “at work.”

Yet More of What VEA and VDOE Are Trying to Hide

In the discussion of the SGP data that Brian Davison sued out of the State Department of Data Suppression, I’ve been focused on the awful teaching (e.g., here, here, and here) that the Virginia Department of “Education” and the Virginia “Education” Association have been attempting to conceal.  But their efforts to hide teacher performance data from the taxpayers who are paying the teachers have another outrageous effect: They suppress the identities of the many great teachers in Virginia’s public schools.

Having looked at the 43 teachers with the worst three-year average math SGPs, let’s turn to the 43 with the best averages:

image

The “Row Labels” column contains the (anonymized) teacher IDs.  The “Grand Total” column is the three-year average for each teacher.  The “Grand Total” row reports the statewide average of each column.
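
A table in that shape can be built from the student-level SGP file with a pivot table; here is a sketch (file and column names are my assumptions about that file’s layout):

    # Sketch: teacher-by-year mean SGPs with "Grand Total" margins, then the
    # 43 best three-year averages.
    import pandas as pd

    sgp = pd.read_csv("va_math_sgp_by_student.csv")   # teacher_id, year, sgp

    table = pd.pivot_table(
        sgp, values="sgp", index="teacher_id", columns="year",
        aggfunc="mean", margins=True, margins_name="Grand Total",
    )
    teachers = table.drop(index="Grand Total")         # keep the margin row out of the ranking
    print(teachers.sort_values("Grand Total", ascending=False).head(43))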

All but three of the 43 teachers at the bottom of the SGP list damaged Virginia schoolchildren for only one year; almost half of the teachers in the present list helped educate Virginia schoolchildren for more than one year.

Here are the Top Ten who taught all three of the years for which we have data.

image

Notice that all but two of these got better performances from their students in 2014 than in 2012.  And, of those two, No. 115415’s flat 99 in 2012, combined with a much lower overall average, suggests only one student (of doubtful statistical significance) in the first year and substantial progress over the two later years.

The preponderance of NoVa suburbs in this list raises the question whether those divisions have better students, or better teachers, or some combination.  See below for some data suggesting that there is more learning, and presumably more teaching, in some more rural districts.  In the meantime, here are data for the Top Ten in graphical form.

image

Or, with the ordinate expanded:

image

The division averages provide some further insights.  Let’s start with the distribution of division mathematics averages for 2014.

image

As shown above, the mean math SGP in 2014 was 49.3.  The division mean was 48.3, with a standard deviation of 6.2.
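
The division rollup works the same way; here is a sketch, again with assumed file and column names:

    # Sketch: division-by-year mean SGPs, then the 2014 Top Ten and Bottom Ten.
    import pandas as pd

    sgp = pd.read_csv("va_math_sgp_by_student.csv")   # division, year, sgp

    by_division = sgp.pivot_table(values="sgp", index="division", columns="year", aggfunc="mean")
    print(by_division[2014].nlargest(10))     # Top Ten divisions in 2014
    print(by_division[2014].nsmallest(10))    # Bottom Ten
    print(by_division[2014].mean(), by_division[2014].std())   # roughly 48.3 and 6.2 per the text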

The Top Ten divisions of 2014 did not include any of the NoVa suburbs.

image

Neither, for that matter, did the bottom ten.

image

Here is the entire 2014 math list, sorted by division (Colonial Beach, Craig, and Surry had no data).

image

When we turn to the division averages by year an interesting pattern emerges: The Top Ten in 2014 all improved from 2012.

image

image

Seems to me that a whole bunch of educators should be put on a bus to one of these divisions to find out how to increase student performance in math.

The Bottom Ten in 2014 mostly declined from 2012, except for Franklin and West Point, which improved; King and Queen, which improved slightly; and Galax, which only had data for 2014.

image

image

But the Virginia Department of “Education” and the Virginia “Education” Association don’t want you to see these data.  Apparently you are not qualified to know what you are getting (or not getting) for your tax dollar.

Why Does VDOE Use Biased Data to Accredit Our Schools?

VDOE has an elaborate scheme to accredit (or not accredit) Virginia’s schools.  The basis is SOL pass rates (plus, for high schools, the graduation rate that depends on passing at least six end-of-course SOL tests).

But we know that the SOL is influenced by economic status.  For example, here are the 2015 reading pass rates by division vs. the percentage of economically disadvantaged students in the division.

We’re not here to discuss whether this correlation suggests that more affluent families live in better school districts, whether their children are better prepared for school, whether their children have higher IQs, or whatever.  The point here is that more affluent kids will show better SOL scores than less affluent students.

That’s only part of the problem with accreditation.  VDOE adjusts (I would say “manipulates”) the accreditation data in secret ways that mostly boost the scores.  In one case, that manipulation converted a 76.3 and a 73.7 into “perfect scores” and embarrassed the Governor.

So it’s no surprise that VDOE has not used, and now is abandoning, a measure of student progress that is insensitive to economic advantage or disadvantage and that might even be resistant to manipulation, the Student Growth Percentile (“SGP”).

VDOE says:

A student growth percentile expresses how much progress a student has made relative to the progress of students whose achievement was similar on previous assessments.
A student growth percentile complements a student’s SOL scaled score and gives his or her teacher, parents and principal a more complete picture of achievement and progress. A high growth percentile is an indicator of effective instruction, regardless of a student’s scaled score.
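
To make that definition concrete, here is a deliberately crude illustration of the idea.  VDOE’s actual SGP comes from a more sophisticated statistical model, not the simple 25-point bands below; this only shows what “relative to students whose achievement was similar” means in practice.

    # Simplified illustration of the SGP idea: rank each student's current score
    # only against students with similar prior scores. Columns are hypothetical.
    import pandas as pd

    df = pd.read_csv("student_scores.csv")   # prior_sol, current_sol

    df["band"] = (df["prior_sol"] // 25) * 25          # group students with similar prior scores
    df["growth_pctile"] = (
        df.groupby("band")["current_sol"]
          .rank(pct=True)                              # percentile rank within the band
          .mul(100)
          .round()
    )
    print(df.head())
    # A student who scored 320 last year and beat most of the other 300-324
    # scorers this year gets a high percentile, whether or not 400 was reached.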

VDOE calculated SGPs in reading, math, and algebra for at least three years, ending in 2014. Then they abandoned the SGP for a new measure that looks to be coarser than the SGP. 

VDOE says that the new measure might be useful in the accreditation process because it allows “partial point[s] for growth,” i.e., another way to boost the scores.  There is no mention of sensitivity to economic disadvantage.

How about it, VDOE?  Does your dandy new measure of progress cancel the advantage of the more affluent students?  And if it does, will you use it to replace the SOL in the accreditation process?

Bless You, Brian Davison!

[For background, see this.]

From: John Butcher
Sent: 4/29/2016 5:46 PM
To: Pyle, Charles (DOE)
Subject: FOIA Request

Mr. Pyle,

I am a Citizen of the Commonwealth and a resident of the City of Richmond at the address set out below.  Under the authority of the Virginia Freedom of Information Act, I request an opportunity to inspect and copy the following public records, as that term is defined at Va. Code § 2.2-3701, that are prepared, owned, or in the possession of the Department of Education:

•    All records that establish or comment upon any reason why it might not be practically possible to provide the records I requested on April 21 within the five work days provided by the Act.

If any record responsive to this request exists in electronic form, I request that you provide it by posting it to the Department’s web site or emailing it to me at the return address above.

In the event the Department elects to withhold any public record responsive to this request, for each such record please:

•    Identify the record withheld by date, author, title, and summary or purpose of the record;

•    Identify all persons outside your Department to whom the record has been shown or to whom copies have been furnished; and

•    State specifically the statutory exemption under which the Department elects to withhold the record.

If you elect to charge me part or all of the actual cost incurred in accessing, duplicating, supplying, or searching for the requested records, please estimate the total charges beforehand.  If those total charges exceed $100, please notify me before you incur the costs.

Please contact me by telephone at the number below or by email at the address above if I can answer any question about this request.

I look forward to hearing from you as promptly as possible and in any event within the five work days provided by the Act.

P.S.: Notwithstanding your email of today, my request was made at 1:31 PM on April 21, not on April 19.  Today being the sixth work day since the request (although I rather like the April 19 date, which would make today the eighth day), the Department is in violation of the Act and has waived any objection to providing the requested records.  Please just send along the records.

 

On 4/29/2016 4:51 PM, Pyle, Charles (DOE) wrote:
>
> Dear Mr. Butcher:
>
> The Virginia Department of Education (VDOE) is in receipt of your request for records dated April 19, 2016, and made in accordance with the Virginia Freedom of Information Act (§ 2.2-3700 et seq.).
>
> Please be advised that it is not practically possible to provide the requested records or determine their availability within the five working days required by FOIA due to the unavailability of staff and the complexity of your request.  Therefore, VDOE is invoking subsection B 4 of § 2.2-3704 of the Code of Virginia to provide the agency with seven additional working days to respond to your request.
>
> Best regards,
> Charles B. Pyle
> Director of Communications
> Virginia Department of Education
> (804) 371-2420
> Charles.Pyle@doe.Virginia.gov