Delaware leads the charge on data integrity

Five State Auditors were on the task force to develop a framework for evaluating state COVID-19 data. Screenshot of the auditing template provided by Delaware State Auditor Kathy McGuiness.

This past Monday, I had the pleasure of speaking to Delaware State Auditor Kathy McGuiness. Auditor McGuiness was elected to her position in 2018, and she hit the ground running by implementing new ways for Delaware residents to report fraud and keep track of how their taxpayer dollars were being spent.

Now, State Auditor McGuiness is focused on COVID-19. She spearheaded the creation of a standardized template that she and other state auditors will use to evaluate their states’ COVID-19 data collection and reporting. The template, created in collaboration with auditors from Florida, Mississippi, Ohio, and Pennsylvania, is a rubric which watchdog offices may use as a baseline in determining which datasets they examine and which questions they ask of state politicians and public health officials.

State Auditor McGuiness sent me a copy of this rubric; I’ll go over its strengths and drawbacks later in this issue. But first, here’s our conversation (lightly edited and condensed for clarity).


Interview

Betsy Ladyzhets: What is a State Auditor?

State Auditor Kathy McGuiness: We’re the ones looking for accountability and transparency, in regards to taxpayer dollars—to identify fraud, waste, and abuse.

BL: What does your job usually look like?  Are you usually conducting investigations like this?

KM: We do investigations, we have the fraud hotline. We triage cases, we document, and then sometimes it leads to an investigation, sometimes it leads to a referral.

BL: What was the impetus for this investigation to look into COVID data collection? Was there a particular event that alerted you to the issue of COVID data across states?

KM: I noticed there were inconsistencies in it being reported nationwide. Everything from the case numbers, to tests being inaccurately counted. I noticed on several national calls that there were other states that were looking for similar bits of information… So who better to look at this information from an independent perspective than the other state fiscal watchdogs, right?

And every state has different rules. You have some [state auditors] that are chairing oversight committees, or other committees… You also have several states where they’re legislative auditors; they don’t necessarily do things like this. They might just take their audit plan every year from the direction of their general assembly. Half [the auditors] in the country are elected, and half are not. So I knew not everyone was going to be able to participate or even get the permission.

There’s a national group called NASACT—the National Association of State Auditors, Comptrollers and Treasurers. I went through their organization, pitched this idea of doing a national template, and asked them, let me do a survey first and see if there’s interest. By the next day, I had a couple of questions ready, and they shot it out in a mass email. I had 29 states respond and 21 show interest immediately. That’s when I [formed] the task force with Florida, Mississippi, Ohio, and Pennsylvania. A multi-state effort. Since then, other states have come on with interest, some wanting to be in the process, some not.

The whole idea was to focus on this collection, monitoring and reporting, for several reasons. Instead of several states maybe wasting their time and resources asking similar questions—we have one baseline where we’re all asking the same questions and trying to find out how our state has done. Down the line, we’ll be able to look at each other, and see how others did, and maybe there are areas of improvement where we can capture a process that we didn’t think of.

BL: You mentioned that in some states, the state auditor (or a position similar to yours) is elected, and in other places it’s appointed. Were there any auditors on the task force who faced pushback from other state leaders?

KM: No, not at all. I mean, there’s a couple [of auditors] who said, “I’d like to participate but I can’t, that’s not how my job works.”  And I believe we had another legislative auditor say, “I still want to do this. I’m going to ask for permission, but I won’t be able to until my General Assembly goes back in session.”  So, it’s all over the map.

BL: Yeah. I’m curious about Florida in particular because there’s been a lot of tension there. There’s Rebekah Jones, a whistleblower scientist at Florida’s public health department who said she was fired for refusing to manipulate the state’s data.

KM: Wow.

BL: So I was very curious to see that state on the list.

KM: I didn’t know about this… I met their auditor, Sherrill Norman, at a NASACT conference. She came to mind especially, because when I put this together I was looking for diversity in the group. But also, who has a performance audit division?  Not all states do… When you talk to me, we only have 27 employees; when you talk to Pennsylvania, they have 400—they have more resources than I do, and I wanted to move this along in a timely manner. There was no politics in the reason [Florida] was picked, if that makes sense.

BL: Yeah, that makes sense. I guess I’ll have to watch and see if Florida publishes the results of their audit.

KM: They will. Or—they should. These [results] are going to be like any other report, or audit, or engagement, or examination, or inspection, or investigation. You make it public.

BL: Could you tell me a bit more about how the framework was developed, or what the collaboration was like between you and the other State Auditors who worked on this? Were there any specific topics you focused on? If you did research or consulted experts, what was that process like?

KM: We had the experts—we had the lead performance audit teams. Our intent was, or is, that the states will be able to evaluate their own data. Then, if they so choose, compare with the other ones. But with these controls in place, this will paint a more accurate and conclusive picture, hopefully, with our efforts to combat this virus and control the spread.

BL: Are there particular goals in terms of say, looking at case data, or looking at deaths data, or anything else? Or are you broadly trying to figure out what will make it more accurate and more efficient?

KM: That’s exactly what we’re trying to do. That’s the whole intention here. Because things have been all over the map. And [the audit] just wants to accurately certify the monitoring of this data. It wants to be able to say, this is independent, we’re not political, we’re not driven by anything or anyone, and we’re all asking the same questions. Keeping in mind, not everyone’s going to be able to get to the answer.

BL: Right.

KM: Because sometimes [state auditors] have access to certain information, and sometimes they don’t.

BL: Is there an example of a piece of information where that differs, depending on how the state’s set up?

KM: Yes—one state might have an MOU, a memorandum of understanding, with an agency to be able to gather their data and look at it. I know New York[’s auditor] can do that with the Medicaid data. They have constant access to it. There’s other [state auditors’ offices] that have to have agreements; there’s other states that don’t have access to certain data from their states. Every audit shop is different. [This framework] is just trying to give us something that we can all do together, and really be able to have a true comparison.

We’re gonna do the best we can, and obviously I’m going to report my data, put it out there. And say another state wants to compare against Delaware—well, they may not be able to on all points. It depends on what information is available, or granted.

BL: Right. So, I volunteer at the COVID Tracking Project, where we compile data from every state. And one major challenge we face is the huge differences in data definitions. Some states define their COVID case number as including confirmed and suspected cases, other states only include confirmed cases, and some don’t provide a definition at all. I’m curious about how the auditing framework addresses these kinds of different definitions.

KM: That’s one of the challenges. You’re faced with inconsistent data collection across all the public health departments, and that’s something we will review. We want to find out what is publicly available and what is not. And that also helps states compare with best practices. We’re just looking for a unified approach with the common metrics.

BL: Yeah. Ideally, every state would be using the same definition. But does every state have the data available to use the same definition?  Some states just don’t track suspected cases.

KM: Correct. Well, I—they shall remain nameless, but they are participating, so you do want to watch this over the next few months. It was my understanding that one state isn’t even testing in nursing homes.

BL: Wow. I definitely will look out for that.

KM: Yeah. I was like, what? How do you do that? How is that possible? I’m not—making fun, I’m not poking blame. What I’m saying is, [we auditors want to look into] how each state reports their metrics and how they’re doing, so that we can determine best practices on our own.

BL: How does the auditing framework account for the fact that each state is facing different challenges because of its population or its geography? Like, I know Ohio has had major outbreaks in its prison populations, and Florida had to shut down testing sites because of the recent hurricane. How do you account for that when you’re trying to use the same framework in these different states?

KM: That’s a good question. I do know that one audit shop already spoke to going a little bit further. So, the auditing framework is flexible. We have our baseline, what we are all agreeing to ask. If they want to go further, they can.

Given the resource constraints of each auditor, we felt it was—we wanted to have something narrow enough and generic enough that everyone could do it in a timely fashion. But if they want to look further, and they have the resources and time, it’s up to them.

BL: Do you have a prospective timeline for this? For Delaware specifically or in general, are there any requirements or timing goals?

KM: Yes. We have—obviously, like everyone else, we’re in the middle of several projects, many got held up… But this has become a priority. There’s a team assigned, and they are moving forward. We’re hoping that, in the next three months, we can have something.

I know that a couple of other states have moved forward with this, and some have not been able to even begin the process of initiating this audit. The times are going to vary depending on the audit shops, and their audit plans, and their resources. And when I say “audit plan”—you usually have your audit plan ready by the time your next fiscal year starts.

BL: I see. Within three months, that’s pretty good for such a thorough project.

KM: It’s aggressive. But we’re ready!  It’s about acquiring information.

BL: And how will the results of the audit impact the states that are evaluated? In Delaware, or in other places if you have any indication of what might happen there?

KM: The hope is that this template and subsequent state reviews validate efforts and certify the integrity of the data. And when it doesn’t, that those shortcomings can be quickly remedied. Again, we’re looking towards the future. Looking towards best practices. And I think this initiative will help policy-makers and help public health officials to better prepare by giving power to the data and knowledge. Truth and transparency.

BL: Power to the data!

KM: That’s right.

BL: That’s what we want! We want consistency, trust.

KM: We do. And we need transparency and accountability. People say they want it, right?  So we’re gonna give it to them. And some states may have recommended changes to data collection based upon these audit results.

The office I took over was more of a “gotcha” office. “Ah, I gotcha, look what you did wrong.” And that’s not where I’m coming from. I’m coming from, partnerships, and collaboration, and there’s always room to improve. I don’t think there’s any penalty here, if any states fare poorly. We will be able to compare and contrast, and look at best practices, but we can all do better and work together to implement best practices for other people.


Analysis

After our interview, I looked at the data audit template itself.

The template focuses on four key components: data collection, data quality, communication, and best practices. As State Auditor McGuiness explained, auditing teams may investigate additional questions around COVID-19 data, but any state that uses this framework needs to fill out all parts of the rubric so that results may be easily compared across states. (To anyone who has struggled through a COVID Tracking Project data entry shift, easily comparing across states sounds like a dream.)

The questions asked by this rubric fall into three main categories: collection, reporting, and monitoring. “Collection” questions ask which metrics the state tracked, including test types, case types, outcomes, hospitalization data, and demographic information. “Reporting” questions ask how state public health departments communicated with COVID-19 testing institutions and hospitals to gather data. “Monitoring” questions ask if states took certain steps to identify errors in their data, such as ensuring that COVID-19 test results were coded directly in reports from labs. This section also includes questions about contact tracing.

The questions forming this auditing framework, as well as its overarching goal of sharing best practices and restoring public trust in data, align well with the work of many reporters and researchers documenting the COVID-19 pandemic. However, I have to wonder if this template will seem insufficient before some state auditors even begin investigating. The template refers to only two types of tests, “COVID-19 tests” (which I assume means PCR tests) and “COVID-19 antibody tests.” Antigen tests are not mentioned, nor are any of the several other testing models on track to come on the market this fall. The template also fails to discuss data on any level smaller than a county, neglecting the intense need for district- or ZIP code-level data as schools begin reopening. And it only mentions demographic data—a priority for the many states where the pandemic has widened disparities—in one line.

Plus, just because state auditors can ask additional questions specific to local data issues they’ve seen doesn’t mean that they will. Speaking to State Auditor McGuiness made it clear to me that COVID-19 data decentralization in the U.S. is not going away any time soon. Not every state can even evaluate its data reporting in the same way, because some auditors’ offices are bound by state legislatures or are unable to access their own states’ public health records.

I look forward to seeing the results of these state data audits and making some comparisons between the 20 states committed to taking part. But to truly understand the scope of U.S. data issues that took place during this pandemic and set up best practices for the future, we need more than independent state audits. We need national oversight and standards.

