Education data reporting in Virginia is expensive, time-consuming, burdensome, and imperfect

by James C. Sherlock

In government, there is rarely an unbiased and objective assessment of what is working and what is not. Just eternal programs joined by new ones.

We citizens all hope that when it comes to governance and budgeting, Virginia rewards programs that prove to be effective and efficient and enforces a cut line for those that don’t. But we know that happens much less often than it should.

We also know that the Governor, his budget officers, and the General Assembly would appreciate objectively measured cut lines. The same would apply to the budget directors of the agencies.

When it comes to education, the government needs three things it usually doesn’t have:

  • measurable goals;
  • quality data to measure them; and
  • the ability to generate data and evaluate it much less expensively and with better quality than we do now.

Education. The last time Virginia’s educational data collection system was inspected, the Virginia Inspector General’s Office agreed with me. That report concerned only direct aid to education, and it was a bloodbath over data quality.

Hopefully gains have been made since then, but data requirements have also exploded. Nothing has fixed the very heavy burden of reporting. According to my surveys and reports, the quality of the data is still very poor.

The rapidly proliferating K-12 education programs in Virginia are a big part of the problem. The number of new programs requiring a Virginia Board of Education (BOE) approved report over the past year and a half must have crushed the all-time record. If administrative overhead is on the BOE checklist in new program approvals, there is no evidence that it is honored.

The data that the VDOE receives seems, from my research and reports, to be late and worse than useless for the purposes of program evaluation. I say worse than useless because the data often seem good enough to use – and are used.

I’ll discuss the problem and then suggest an approach that will try to fix the problem at both ends:

  • too many hours spent on data collection and reporting at all levels;
  • too many people authorized to enter and modify data; and
  • far too little quality assurance before the data reaches the VDOE and the Federal Government.

I’ll turn to a terrific editorial by Matt Hurt in the Roanoke Times. The title, “Adding more and more initiatives worsens education outcomes,” captures the point.

We started with the hope of making sure our students could read, write and do math, and then all sorts of other things were added.

In recent years, initiatives (not all-inclusive) have been added to educators’ plates by the Virginia General Assembly and the Virginia Board of Education, in addition to whatever was already there. As of fall 2021, here’s the list: year-round “growth” assessments; balanced evaluation plans; implementing new educator evaluation standards, new social-emotional standards, cultural competency training, and new IT standards; the implementation of model policies regarding the treatment of transgender students; the Virginia Kindergarten Readiness Program; federal programs related to pandemic relief funds; reviews of locally assigned verified credits; a new assessment program for students with significant cognitive impairment; and implementing Virginia’s Inclusive Self-Assessment and Action Planning Tool.

These are mostly the brilliant ideas of Virginia schools of education – mostly UVa and VCU. Some actually try to help. Most try to apply woke doctrine at the K-12 level. They don’t care if their programs work and don’t want them to be measured.

Read the story Matt tells about the Superintendent of Wise County Schools working 12-hour days, a schedule driven by reporting requirements and a lack of staff to compile required reports rather than by the leadership responsibilities on which she would like to spend more time.

The VDOE and BOE, in the absence of reliable data on what is going on in schools, may be further influenced by the “expert” panels they convene. Again the leaders are the schools of education, again dominated by UVa and VCU.

To this I will add that last year the VDOE, almost overnight, stood up a very large online education program, Virtual Virginia.

  • We are spending tens of millions where we once spent very little;
  • It adds even more reporting requirements for schools;
  • We have no objective idea of its value, especially compared to the successful virtual public schools already serving Virginia. They have already responded to this need and done their own reporting; and
  • The true costs of the VDOE effort are buried in state and district school budgets. The VDOE, the Governor’s Office and the General Assembly are unlikely to have a clear idea of the true costs.

Measurable goals. Matt tells the story of the Wise County Superintendent of Schools who, with no money to hire extra staff to handle all of these programs and their reports, was working 12-hour days, which were not enough. Then he writes:

Our leadership in Richmond (Governor, School Board and General Assembly) really needs to look carefully at our educational priorities and develop a unified hierarchy that contains measurable goals. Any program or initiative that is not aligned with the top priority(s) must be tabled until the top priorities are met. (Emphasis added)

How does VDOE get good data for these “Measurable Goals”?

Measurable objectives must be defined. Once defined, data requirements can be programmed into reporting software. The same goes for basic data quality checks.
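As a minimal sketch of what “programming data requirements into reporting software” could look like, the fragment below encodes a hypothetical measurable goal as a machine-checkable field requirement. The field name, thresholds, and class design are illustrative assumptions, not any actual VDOE specification.

```python
# Sketch: a measurable goal expressed as a data requirement that software
# can check at entry time. All names and thresholds are hypothetical.
from dataclasses import dataclass
from typing import Optional

@dataclass
class FieldRequirement:
    name: str
    required: bool = True
    min_value: Optional[float] = None
    max_value: Optional[float] = None

    def validate(self, record: dict) -> list:
        """Return a list of problems with this field in one submitted record."""
        errors = []
        value = record.get(self.name)
        if value is None:
            if self.required:
                errors.append(f"{self.name}: missing required value")
            return errors
        if self.min_value is not None and value < self.min_value:
            errors.append(f"{self.name}: {value} is below the minimum {self.min_value}")
        if self.max_value is not None and value > self.max_value:
            errors.append(f"{self.name}: {value} is above the maximum {self.max_value}")
        return errors

# Hypothetical example: a reading-proficiency reporting requirement,
# expressed as a pass rate between 0.0 and 1.0.
req = FieldRequirement("grade3_reading_pass_rate", min_value=0.0, max_value=1.0)
print(req.validate({"grade3_reading_pass_rate": 0.82}))  # [] – clean entry
print(req.validate({}))                                  # missing-value error
```

The point of the sketch is that once a goal is written down this precisely, the same definition drives both data entry screens and quality checks, with no human re-keying in between.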

Almost all of the data that reaches the governor’s office, the General Assembly, and the federal government (and school districts for district-initiated reports) comes from individual schools. Data requirements overwhelm schools. We cannot trust results as a whole, even if and when measurable goals are set.

So we need to integrate and greatly simplify the process.

What are we doing now? Virginia has 132 school districts that want (or maybe not, in the smaller districts) to get their hands on the data from every one of our more than 2,100 public schools to massage them.

  • Rule 1: My first suggestion for a data architecture is to gather and validate the data at the source closest to the problem and not give anyone other than the author the power to change it. Translated for K-12 education: let the districts read it, but don’t let them write it.
  • Tailor-made interfaces. To make a major contribution to reducing labor, system analysts and coders in a data center will need to create interfaces tailored to each grade level and level of access credential.
  • Security and access controls. For data entry users, the interface would present the data entry fields required each day in a single view that integrates them all, organized around the entire entry requirement rather than around the individual programs that require the data.
  • Data as a shared asset. Each data point would be entered once, regardless of the number of program reports using the data. The reports would be created by the system, not by the human entering the data.
  • Authority and responsibility. The school principal would have the power to oversee data entry, with the authority and ability to delegate that power to staff. But he or she would remain responsible for the quality of the data and would be informed daily of problems with the entries.
  • Quality assurance. Rule #1 states that we won’t let the districts touch the data, just read it. (Yes, I repeated it for emphasis. They can thank me later.) Data quality assurance should take place at the data center level with algorithms that check for gaps and potential errors in data entry (identified by probabilities and trends learned by the algorithms) and feed them back to the author for validation or modification. The null entries that dominate the daily reports should be filled in automatically when submitting new data.
  • Artificial intelligence and machine learning. Artificial intelligence and machine learning will be key enabling technologies at this scale.
  • Data control. Don’t allow data to be copied or moved; allow it only to be accessed and read through authorized reports.
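The rules above can be sketched in a few lines. The toy class below illustrates Rule 1 and the single-entry principle: each data point is entered once by its author at the school, only that author may correct it, and everyone else – districts included – gets read-only access. All names here are hypothetical; this is a sketch of the access model, not an actual system design.

```python
# Sketch of Rule 1 and "data as a shared asset": write once at the source,
# author-only corrections, read-only access for everyone else.
class SchoolDataStore:
    def __init__(self):
        self._records = {}  # key -> (value, author)

    def enter(self, key, value, author):
        """A school-level author enters a data point exactly once."""
        if key in self._records:
            raise PermissionError(f"{key} already entered; use correct()")
        self._records[key] = (value, author)

    def correct(self, key, value, author):
        """Only the original author may modify an entry (Rule 1)."""
        if key not in self._records:
            raise KeyError(key)
        if self._records[key][1] != author:
            raise PermissionError("only the original author may change data")
        self._records[key] = (value, author)

    def read(self, key):
        """Districts, the VDOE, and report generators only read the data."""
        return self._records[key][0]

store = SchoolDataStore()
store.enter("school42/2023-10-05/attendance", 0.94, author="principal_a")
print(store.read("school42/2023-10-05/attendance"))  # 0.94
```

A district user calling `store.correct(...)` under a different author name would get a `PermissionError`: the district can read the number, but only the school that entered it can change it, which is exactly the audit trail the quality-assurance step needs.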

Do it. Not rocket science, but this approach will work.

These fundamentals will ensure that the reports generated will be more accurate and complete than today’s, and that they will be generated with a huge reduction in working hours at all levels.

How to achieve it?

The VDOE will have to hire a contractor to develop and manage an architecture at this scale. The lift is heavy.

This contractor can describe the enterprise architecture at all levels and continuously test and improve it in a pilot project.

More complete, more accurate, faster, and cheaper data with far less human labor are very good goals for data collected across Virginia’s more than 2,100 public schools.

Eliminating programs that do not make the cut will become possible because decision-makers, having defined measurable objectives, will be able to rely on the data to evaluate them.

Laws? This may require a change in the law. If so, change it.

Money? It will be expensive to set up and operate. But, translating staff time into costs, it will be cheaper than what we do now.

Ask the Federal Department of Education, an end user of much of the data, to fund it as a demonstration project for the rest of the states. I think they would rush to do it.

Results. Done correctly, this approach can significantly reduce data entry requirements for schools and divisions, eliminate manual compilation of reports by schools and divisions in most cases, and improve the quality of reports.

I don’t know who will oppose it, but some groups will.

Ask them why.
