Longtime readers might remember that, back in August, I surveyed the available data on how COVID-19 is impacting American schools.
At the time, very few states were reporting school-specific data, even as school systems around the nation began to reopen for in-person instruction. In that early survey, I highlighted only Iowa as a state including district-level test positivity data on its COVID-19 dashboard. This dearth of data disappointed, but did not surprise me. There was no federal mandate for states, counties, or school districts to report such data, nor did the federal government compile such information.
There is still no federal mandate for school COVID-19 data, despite pleas from politicians and educators alike. So, as school systems across the country close out their fall semesters amidst a growing outbreak and prepare for the spring, I decided to revisit my survey. I set out to find how many states are reporting COVID-19 cases in their K-12 schools, which metrics they report, and how often. To get started, I used the COVID Monitor, a volunteer effort run by Rebekah Jones that compiles K-12 case counts from government sources and news reports.
Overall, many more states are providing school data now than in August. But the data are spotty and inconsistent; most states simply report case counts, making it difficult to contextualize school infections. (For more on why denominators are important in analyzing school data, see my October 4 issue.)
In total, 35 states report case counts for all public K-12 schools. Six states report in an incomplete form, either not including all schools or not including specific case counts.
Nine states do not report school COVID-19 data at all: Alaska, California, Georgia, Nebraska, Nevada, New Mexico, Oklahoma, Pennsylvania, and Wyoming.
Most states update their school data either weekly or biweekly. Only 7 states update daily.
Most states do not report counts of deaths and hospitalizations connected to school COVID-19 outbreaks. Only 5 states report deaths (Colorado, Kansas, North Carolina, Kentucky, and Virginia), and only 1 state reports hospitalizations (Kansas).
Only 3 states report in-person enrollment numbers: New York, Massachusetts, and Texas.
New York is the only state to report counts of COVID-19 tests conducted for K-12 students and staff.
And here are a couple of example states I’d like to highlight:
New York has the most complete school data by far, scoring 19 out of a possible 21 points on my index. Not only does the state report enrollment and total tests administered to students and staff, New York’s COVID-19 Report Card dashboard includes the test type (usually PCR) and lab each school is using. Test turnaround times are also reported for some schools. This dashboard should be a model for other states.
Indiana has a dashboard that I like because it is easy to find and navigate. You don’t have to search through PDFs or go to a separate dashboard—simply click on the “Schools” tab at the top of the state’s main COVID-19 data page, and you will see cumulative case counts and a distribution map. Clicking an individual school on the map will cause the dashboard to automatically filter. Indiana also reports race and ethnicity breakdowns for school cases, which I haven’t seen from any other state.
Texas provides detailed spreadsheets with case counts and on-campus enrollments for over 10,000 individual schools. The state reports new cases (in the past week), total cases, and the source of school-related infections (on campus, off campus, and unknown). The infection source data suggest that Texas is prioritizing schools in its contact tracing efforts.
Minnesota is one state that provides incomplete data. The state reports a list of school buildings that have seen 5 or more COVID-19 cases in students or staff during the past 28 days. Specific case counts are not provided, nor are specific dates on which these cases occurred. If I were a Minnesota parent at one of these listed schools, I’m not sure what I’d be able to do with this information beyond demand that my child stay home.
As cases surge across the country, more children become infected, and school opening once again becomes a heated debate from New York City to North Dakota, it is vital that we know how much COVID-19 is actually spreading through classrooms. How can we decide if school opening is a risk to students, teachers, and staff if we don’t know how many students, teachers, and staff have actually gotten sick?
Moreover, how can we understand the severity of this threat without enrollment or testing numbers? Reporting that a single school has seen three cases is like reporting that a single town has seen three cases; the number is worth very little if it cannot be compared to a broader population.
Volunteer sources such as the COVID Monitor and Emily Oster’s COVID-19 School Response Dashboard are able to compile some information, but such work cannot compare to the systemic data collection efforts that national and state governments may undertake. If you live in one of those nine states that doesn’t report any school COVID-19 data, I suggest you get on the phone to your governor and ask why.
Also, speaking of New York City, here’s an update to the 3% threshold I reported on last week:
one segment of my newsletter this week focused on the discrepancies around NYC's infamous school closure benchmark: a weekly average test positivity rate of 3%. thanks to different test reporting methodologies, the city's data hit this mark before the state's did. (1/6) https://t.co/Yp5FFu5Sck
Leading in Crisis briefs: A series of briefs from the Consortium for Policy Research in Education document how 120 principals in 19 states responded to COVID-19 in the spring. The briefs compile analyses, summaries, and recommendations on topics ranging from accountability during school closures to calm during a crisis.
COVID-19 in Congress: GovTrack.us, a project which normally documents bills and resolutions in the U.S. Congress, is currently tracking how COVID-19 has spread through the national legislature. The tracker currently includes 87 legislators who have entered quarantine, tested positive, or come into contact with someone who had been diagnosed with the disease.
On November 18, New York City mayor Bill de Blasio announced that the city’s schools would close until further notice. Students returned to remote learning, while restaurants and bars remain open—even indoor dining is permitted.
This closure came because the city had passed a 3% test positivity threshold: 3% of all tests conducted in the city in the week leading up to November 18 had returned positive results, indicating to the NYC Department of Health and de Blasio that COVID-19 was spreading rampantly in the community. As a result—and as de Blasio had promised in September—the city’s schools had to close.
But that 3% value is less straightforward than it first appears. In closing schools, de Blasio cited data collected by the NYC Department of Health, which counts new test results on the day that they are collected. The state of New York, however, which controls dining bans and other restrictions, counts new test results on the day that they are reported. Here’s how Joseph Goldstein and Jesse McKinley explain this discrepancy in the New York Times:
So if an infected person goes to a clinic to have his nose swabbed on Monday, that sample is often delivered to a laboratory where it is tested. If those results are reported to the health authorities on Wednesday, the state and city would record it differently. The state would include it with Wednesday’s tally of new cases, while the city would add it to Monday’s column.
The date is not the only difference: the state reports tests in units of test encounters, while the city appears to report in units of people. (See my September 6 issue for details on these unit differences.) And the state includes antigen tests in its count, while the city only includes PCR tests. These small differences in test reporting methodology can make a sizeable dent in the day-to-day numbers. On the day that Goldstein and McKinley’s piece was published, for example, the city reported an average test positivity rate of 3.09% while the state reported a rate of 2.54% for the city.
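The date-attribution difference alone can move the daily numbers. Here’s a toy sketch—using invented records, not actual NYC or NYS data—of how the same test results land in different daily columns depending on whether you count by collection date or report date:

```python
from collections import defaultdict

# Invented records for illustration: (collection_date, report_date, is_positive).
tests = [
    ("Mon", "Wed", True),   # swabbed Monday, result reported Wednesday
    ("Mon", "Tue", False),
    ("Tue", "Wed", False),
]

def daily_counts(records, date_field):
    """Tally [positives, total] per day, keyed by the chosen date field."""
    counts = defaultdict(lambda: [0, 0])
    for collected, reported, positive in records:
        day = collected if date_field == "collection" else reported
        counts[day][1] += 1
        counts[day][0] += int(positive)
    return dict(counts)

city_style = daily_counts(tests, "collection")   # city: count by swab date
state_style = daily_counts(tests, "report")      # state: count by report date
# The one positive result lands in Monday's column under the city's method
# and in Wednesday's column under the state's.
```

Same tests, different daily columns—which is how the two governments could hit the 3% mark on different days.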
Meanwhile, some public health experts have questioned why a test positivity rate would even be used in isolation. The CDC recommends using a combination of test positivity, new cases, and a school’s ability to mitigate virus spread through contact tracing and other efforts. But NYC became fixated on that 3% benchmark; when the benchmark was hit, the schools closed.
Overall, the NYC schools discrepancy is indicative of an American education system that is still not collecting adequate data on how COVID-19 is impacting classrooms—much less using these data in a consistent manner. Science Magazine’s Gretchen Vogel and Jennifer Couzin-Frankel describe how a lack of data has made it difficult for school administrators and public health researchers alike to see where outbreaks are occurring. Conflicting scientific evidence on how children transmit the coronavirus hasn’t helped, either.
Emily Oster, a Brown University economist whom I interviewed back in October, continues to run one of a few comprehensive data sources on COVID-19 in schools. Oster has faced criticism for her dashboard’s failure to include a diverse survey population and for speaking as an expert on school transmission when she doesn’t have a background in epidemiology. Still, CDC Director Robert Redfield recently cited this dashboard at a White House Coronavirus Task Force briefing—demonstrating the need for more complete and trustworthy data on the topic. The COVID Monitor, another volunteer dashboard led by former Florida official Rebekah Jones, covers over 240,000 K-12 schools but does not include testing or enrollment numbers.
For me, at least, the NYC schools discrepancy has been a reminder to get back on the schools beat. Next week, I will be conducting a review of every state’s COVID-19 school data—including which metrics are reported and what benchmarks the state uses to declare schools open or closed. If there are other specific questions you’d like me to consider, shoot me an email or let me know in the comments.
The COVID Tracking Project’s City Dataset: For about five months, COVID Tracking Project volunteers collected and standardized data from highly populated cities and counties across the country. The project started in late May with the intent of tracking COVID-19 in cities that were seeing large Black Lives Matter protests, then expanded following the Sunbelt surge in June. Sixty-five cities and counties are included in the dataset; you can read more about the dataset’s methodology and major findings on the Project’s blog.
COVID-19 State and County Policy Orders: This dataset includes over 3,000 COVID-19-related policies at the state and county level addressing public health, local economies, and disparities during the pandemic. Data were compiled by students at the Boston University School of Public Health and other student volunteers, and the dataset is housed at the HHS.
The COVID-19 School Response Dashboard has surveyed nearly 1,000 individual schools and districts on their enrollments, case counts, and COVID-19 mitigation strategies. Screenshot retrieved on October 3.
The data sources on COVID-19 in U.S. K-12 schools vary widely, but most of them have one thing in common: they only report case counts.
Texas retracted its school data this past week due to errors. Florida recently began publishing school reports—which list out cases by individual school while failing to provide cumulative totals. But a larger problem for these states and others is that, when case numbers are reported in isolation, there is no way to compare outbreaks at different locations.
Imagine, for example, that you only knew that Wisconsin had seen 18,000 cases in the past week, while Texas had seen 28,000. You would assume that Texas is currently in more dire straits, with more people infected. But adjust for population—divide those case numbers by the populations of both states—and you find that Texas has an infection rate of about 95 people per 100,000 Texans, while Wisconsin has a rate of about 303 people per 100,000, over three times higher. Texas is slowly recovering from its summer outbreak, while Wisconsin is an outbreak site of major national concern.
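The per-capita adjustment is one line of arithmetic. A minimal sketch (the population figures here are my own rough 2020 estimates, not from the original reporting):

```python
def cases_per_100k(cases: int, population: int) -> float:
    """Convert a raw case count into a rate per 100,000 residents."""
    return cases / population * 100_000

# Rough 2020 population estimates (assumptions, for illustration only).
wisconsin_rate = cases_per_100k(18_000, 5_940_000)   # roughly 300 per 100k
texas_rate = cases_per_100k(28_000, 29_400_000)      # roughly 95 per 100k
# Despite fewer raw cases, Wisconsin's rate is over three times Texas's.
```

The same division works at the school level—if, and only if, you know the in-person enrollment to divide by.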
In the case of school data, enrollment numbers are the key to these comparisons. Knowing how many students are infected in your district may be useful, but unless you know how many students are actually going into school buildings on a regular basis, it is difficult to translate the case numbers into actionable conclusions. The majority of states which report school COVID-19 data do not report such numbers, and even those that do may have incomplete data. New York’s dashboard, for example, currently reports 0 staff members in the New York City school district, which opened for in-person instruction last week.
Volunteer datasets similarly focus on case numbers. The National Education Association School and Campus COVID-19 Reporting Site, built from the crowdsourced spreadsheet of Kansas high school teacher Alisha Morris, compiles case counts from news outlets and volunteer reports. The COVID Monitor, a school dashboard produced by Rebekah Jones’ Florida COVID Action project, combines news and volunteer reporting with state-reported numbers. Both of these efforts are incredibly comprehensive in documenting where COVID-19 is impacting students and teachers, but without enrollment numbers for the schools, it is difficult to use the data for meaningful comparison.
Even the New York Times focuses on case counts. The Times’ review of school COVID-19 cases found extremely scattered public reporting, but the paper failed to include any denominators—not even the county case counts which this paper has been tracking since early in the pandemic. Alexander Russo, columnist at education journal Phi Delta Kappan and friend of this newsletter, recently commented on how such cases-only reporting may paint a misleading picture of the pandemic’s impact.
Clearly, we need denominators for our case counts. And a new dataset is out to provide this crucial metric. Emily Oster, Professor of Economics and Public Policy at Brown University, collaborated with software company Qualtrics and several national education associations to build a COVID-19 school dashboard which focuses on case rates, not counts.
This project sources data by directly surveying schools every two weeks, rather than relying on sporadic news and volunteer reports. And it includes information about school reopening plans and mitigation strategies, such as whether masks, increased ventilation, and symptom screenings are in use. As of the dataset’s most recent update (for the two-week period of September 14 to 27), 962 schools in 47 states are included. These schools report an average student infection rate (confirmed and suspected cases) of 0.62% and an average staff infection rate of 0.72%; both rates are up from 0.51% and 0.5%, respectively, in the previous two weeks. For more initial findings, see this NPR feature on the dashboard, published on September 23.
I spoke to Oster this past Tuesday, only four days after the dashboard’s public release. She went into more detail about the project’s methodology and her future plans for tracking COVID-19 in schools. (This interview has been lightly edited and condensed for clarity.)
Interview
Betsy Ladyzhets: What is your background in data and education reporting? What have you been working on during the COVID-19 pandemic that led you to this dashboard?
Emily Oster: I am, by training, an economist, so I have a lot of background in data analysis and some in data collection. But most of my work, virtually all of my work has been on health, not on education. I have written a couple of books on pregnancy and parenting, so I have this audience of parents. And during the pandemic, I was writing a lot about kids and COVID. And then that led me to be interested in issues around schools, and putting together this team to do the data collection for the dashboard.
BL: Who else is on the team?
EO: The partnership—the primary people who are doing the work and analysis—is Qualtrics, which is a tech company. And then, there are a number of educational association groups. The School Superintendents Association, the National Association of Elementary School Principals, the National Association of Secondary School Principals, that was the initial core team. Then, we’ve got a lot of distribution help from the charter school alliance, from a bunch of the independent schools associations. A lot of different educational groups have done distribution work.
BL: How did you develop partnerships with these different education groups?
EO: I had expressed in some public forum that I thought there should be more of this data collection, and someone from Qualtrics reached out and said, “We think there should be more of this, too. Maybe we can help.” And around this time, I was connected with a woman at the school superintendents association, who also said, “I think we should do this, maybe we can help.” Those were the two key pieces, and it came together from there.
BL: Yeah, it’s good to have—it seems like a very useful partnership, that you have the tech expertise but also the people who are actually interacting with teachers and students.
EO: Yeah. I think our biggest get for the dashboard, and what is potentially useful about it, is that we start at the school level. We know what the schools are doing. We’re in direct contact with them.
BL: I know from poking around the dashboard and reading the NPR article that the way you’re gathering data is with that direct interface, surveying schools. Why did you choose this method as opposed to looking at news articles or compiling data from public sources?
EO: It was really important for us to understand the context around school reopening before we asked about the COVID cases. We really wanted to know: how many kids do you have in school, are they actually in school or are they virtual, what kind of enrollment do you have? And also, what are you doing as mitigation? To come, ultimately, to understand what’s happening with cases, we really need to start by understanding, like, are you wearing masks? Are you distancing? Are you doing all of these things? So then, if we do see cases, we can go back and look and say okay, can we make any conclusions about which of these precautions are helping.
In particular, these enrollment numbers give us the ability to say something about not just cases, but rates. To be able to say, this is the share of people that are infected. Which I think is a very important number, and arguably more important for decision-making, than counts.
BL: Yeah, I was going to ask about that. Your dashboard, unlike a couple of other school COVID data projects, actually has denominators, so that you can compare case rates.
EO: That’s our thing. That’s our whole pitch. We have denominators.
BL: Why is it so important to have denominators?
EO: I think the importance of denominators is, it tells you something about the actual risk of encountering someone with COVID… If you’re going to send your kid off to school, and if you’re going to send your kid to a school of 1,200 people, I think it is useful to understand—are there likely to be 100 kids in the building with COVID? Is there likely to be one kid in the building with COVID?
And similarly, thinking about the risk to your kid, if your kid is going to be in the building for two weeks, what’s the average experience? Is there a ten percent chance they’re going to get the coronavirus over these two weeks? Is there a one percent chance? I think that that is the thing we should be making decisions on. We really need those denominators to get the rate.
BL: Absolutely. Could you tell me more about how the surveys work? What questions you’re asking, and how often you’re collecting data?
EO: There’s two different avenues for data collection… First, if you’re an individual school, then the way we’re collecting the data is that you enroll in a baseline survey on Qualtrics. We ask you about your enrollment, your opening model, what share of your kids are in person, how many staff you have, are they in person. And then, if you have in-person instruction, we ask you about masking and distancing, what you’re doing on those conventions. And then we ask maybe one or two demographic questions, like are you doing free or reduced-price lunch, or financial aid if it’s a private school.
That [initial survey] is followed up every other week with a survey that is very short. It’s basically, how many confirmed and suspected cases do you have in students and staff, and then [we ask schools to] confirm their in-person enrollment, just to see if there have been large changes in the opening model.
And then, on the district side, we’re asking all the same questions, but—in the case of the districts, there are a number where [superintendents] have said, “We’d like to enroll our entire school district in your thing, and we’re going to give you all of our data.” When we do that, we’re actually collecting the data internally in Excel. We send them an Excel sheet with their schools, they fill out that same information [as in the school survey], and then we come back again biweekly and ask them those same questions. It’s the same information, it’s just that rather than making them go through 25 versions of the same Qualtrics survey, we have it all in one.
BL: What mechanisms do you have in place for flagging errors? I know that’s a concern with this kind of manual back and forth.
EO: On the district side, there’s a cleaning procedure. When the surveys come in, obviously we don’t change them, but we look them over. If there’s something that’s wrong, like the number of COVID cases is greater than the number of people, or they’ve reported three billion students enrolled, we go back to the district and ask, “Can you look at this?”
Then, on the individual school side, there’s a bunch of validation built into the Qualtrics survey operation. And we have some procedures which we’re working on ramping up which are going to do a little bit of hand lookup, just to make sure that we’re getting valid data.
BL: What is your sample of schools like so far? Is there a particular area, like any states or types of schools that you have more complete data so far, or any areas where you’re prioritizing in trying to get them to take the surveys?
EO: We’re an equal opportunity prioritizer. We’ll take anybody. There are a few states where we have better representation of private schools, because [private school associations are involved in roll-out]. We have more schools in Washington than elsewhere.
Particularly on the public school side, we’re very concerned about enrolling entire districts. That’s the easiest thing for us, it’s the most robust. It is also—we think it provides the most service to the district. And so we are spending a lot of time doing outreach to states and to districts, trying to get people to encourage their districts to enroll.
BL: Another thing I’m curious about is testing. Just yesterday, the Trump administration announced that they’re going to deploy 150 million rapid antigen tests around the country, once they’re made by Abbott, and they’re going to focus on getting those tests to students and teachers. Is testing something that you’re thinking about tracking?
EO: Yeah. We ask [the schools], are you doing any routine testing of anybody, and most of them say they’re not. But I think it would be very interesting to incorporate. Part of my hope for this project is that, over time, as we get more people enrolled and we get more of a rhythm of reaching people routinely, that there will be questions we can add. We’ll potentially get to a place where we’ll say, “Okay, now, a bunch of districts are doing testing, let’s put that in.” And we’ll try to figure out, how common is that, and who’s doing it.
BL: There are also states that are reporting COVID data in schools. I know New York has a dashboard, that’s pretty extensive, while other states report numbers by county or district or just overall. Is your project doing anything with those public data, or with other volunteer projects that track COVID in schools?
EO: Not at the moment. I think that we are eager to—there are a number of states that have very good dashboards, and our goal, one of the things we are working on is, how can we basically pull that in? One of the issues is that most of those dashboards just report cases, and so in order to pull them into what we’re doing, we need to go behind them and actually figure out what the initial enrollments were.
BL: Which states do you think are doing the best job so far?
EO: I mean, New York’s is pretty good. Tennessee has a pretty good dashboard. South Carolina. There’s a few.
BL: I know New York is one—I think it’s the only one that has both testing numbers and enrollment numbers. (Editor’s note: I checked; this is true.)
EO: Exactly.
BL: Last question: how do you expect the dashboard to be utilized in future research, and are you seeing any applications of it so far?
EO: No, it’s literally been, like, four days. My guess is that we will see more—we’ll see some usage by districts, as they try to think about opening, that’s the first use case. Just districts that are opening, trying to think about what’s the right thing to do. My guess is that, in the long run, maybe we’ll see some research with this. That isn’t the goal of the project, but we’ll see.
BL: The focus is on helping districts compare to each other.
EO: Exactly, yeah.
Analysis
I’m excited about this dashboard. First of all, it can’t be overstated: denominators are huge. Knowing that the estimated infection rate of K-12 students in the U.S. is under one percent is so much more useful from a decision-making standpoint than the actual number of cases.
Second, the school survey model is a novel method with advantages for one specific group: the very schools included in this dataset. This dashboard is not particularly useful for me, a COVID-19 journalist, right now; its sample size is small, and the data are not currently available for download by outside users. (Oster told me that she is planning to set up a validation feature, so that she and other partners on this project can track how their data are being used.) But the school administrators who fill out the project’s biweekly surveys will be able to see COVID-19 trends for their students and staff, compared to trends at other schools across the country. They are essentially getting free consulting on their school reopening plans.
I have one major concern, however. As Oster explained in our interview, the dashboard currently includes an abundance of private and charter schools in its sample, due to partnerships with private and charter school associations.
According to Education Week, public schools made up 70% of American schools in 2017-2018. In Oster’s dashboard, these schools are 67% of the sample, while private, charter, and religious schools make up the rest. At a glance, this seems fairly representative of the country’s school demographics. However, the average public school has far more students than the average private school; without seeing the actual enrollment numbers of the schools included in this dashboard, it is difficult to determine how balanced the dashboard’s sample truly is.
In addition, the dataset’s sample so far shows a bias for suburban schools. The schools surveyed are 37% suburban, 28% rural, 26% urban, and 8% town. Suburban school districts tend to receive more funding than urban districts, and suburban districts are historically sites of school segregation. Finally, this dataset so far heavily represents private schools in Washington, with 106 schools, over 10% of the sample, coming from this state. West Virginia, Alabama, and Mississippi, all more rural states which rank in the bottom ten in U.S. News & World Report’s education rankings, are so far not represented at all.
A recent New Yorker article by Alec MacGillis draws attention to the low-income students of color who may be left behind in this era of remote learning. Students whose parents and guardians need to continue working outside the home, or otherwise do not have the resources to support kids with an array of Zoom links and homework platforms, may lose a year of education if their schools don’t reopen—and yet these students and their families are more vulnerable to COVID-19 if they do go back in person.
The schools which serve low-income minority communities are likely to need this dashboard more than any others. And yet these very schools may be left out of data collection, as their principals and superintendents may not have the bandwidth to fill out even the simplest survey. Extra effort could be needed to ensure that crucial schools are not left behind. The COVID-19 School Response Dashboard, and other future school data sources, must prioritize diversity in their data collection if they are to be truly complete.
The CDC was busy last week. In addition to their vaccination playbook, the agency released indicators for COVID-19 in schools intended to help school administrators make decisions about the safety of in-person learning. The indicators provide a five-tier system, from “lowest risk of transmission” (under 5 cases per 100,000 people, under 3% test positivity) to “highest risk” (over 200 cases per 100,000 people, over 10% test positivity). It is unclear what utility these guidelines will have for the many school districts that have already started their fall semesters, but, uh, maybe New York City can use them?
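Only the extremes of the five-tier scale are quoted above, so the sketch below collapses the three middle tiers into one bucket; the and/or logic combining the two metrics is my assumption, not something stated in the text:

```python
def school_risk_tier(cases_per_100k: float, positivity_pct: float) -> str:
    """Classify a community against the two CDC extremes quoted above.

    Assumptions for illustration: the combination logic (and/or) and the
    single "intermediate" bucket are mine; the actual indicators define
    three distinct intermediate tiers not enumerated in this issue.
    """
    if cases_per_100k < 5 and positivity_pct < 3:
        return "lowest risk of transmission"
    if cases_per_100k > 200 or positivity_pct > 10:
        return "highest risk"
    return "intermediate"
```

Even this rough classification requires a case rate per 100,000—which is exactly the denominator most school reporting lacks.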
Speaking of New York: the state’s dashboard on COVID-19 in schools that I described in last week’s issue is now live. Users can search for a specific school district, then view case and test numbers for that district’s students and staff. At least, they should be able to; many districts, including New York City, are not yet reporting data. (The NYC district page reports zeros for all values as of my sending this issue.)
Los Angeles Unified, the nation’s second-largest school district, is building its own dashboard, the Los Angeles Times reported last week. The district plans to open for in-person instruction in November or later, at which point all students and staff will be tested for COVID-19. Test results down to the classroom level will be available on a public dashboard.
Wisconsin journalists have stepped in to monitor COVID-19 outbreaks in schools, as the state has so far failed to report these data. A public dashboard available via the Milwaukee Journal Sentinel and the USA Today Network allows users to see case counts and resulting quarantine and cleaning actions at K-12 schools across the state. Wisconsin residents can submit additional cases through a Google form.
According to the COVID Monitor, states that report K-12 COVID-19 case counts now include: Arkansas, Hawaii, Kentucky, Louisiana, Mississippi, New Hampshire, Ohio, South Carolina, South Dakota, Tennessee, Texas, and Utah. Some of these state reports are far more precise than others; Texas and Utah, for example, both report only total case counts. The COVID Monitor reports over 10,000 confirmed COVID-19 cases in K-12 schools as of September 20, with another 17,000 reported cases pending.
A recent article in the Chronicle of Higher Education by Michael Vasquez explains common issues with reporting COVID-19 cases on college and university campuses: inconsistencies across school dashboards, administrations unwilling to report data, and other challenges.
New York Governor Andrew Cuomo discussed this dashboard in his online briefing last Tuesday, September 8. (If you’d like to watch, start at about 18:00.) He explained that every school district is now required to report test and case numbers daily to New York’s Department of Health. Local public health departments and state labs performing testing are also required to report these numbers, so that the state department can cross-check against three different sources. Cases and tests will be published by school on the new dashboard, called the COVID Report Card.
In his briefing, Governor Cuomo showed a mockup of what the Report Card will look like. The available data includes positive cases by date, tests administered by the school (including test type, lab used, and test wait time), the school’s opening status (i.e., whether it is operating remotely, in person, or on a hybrid model), and the percentage of on-site students and staff who test positive.
This dataset promises to be much more complete than any other state’s reporting on COVID-19 in schools. But I haven’t been able to closely examine these data yet, because the dashboard has yet to come online.
According to reporting from Gothamist, state officials planned for the dashboard to begin showing data on September 9. As I send this newsletter on September 13, however, the dashboard provides only a message stating that the COVID Report Card will be live “when the reporting starts to come back.”
“The facts give people comfort,” Governor Cuomo said in his briefing. So, Governor, where are the facts? Where are the data? When will New York students, parents, and teachers be able to follow COVID-19 in their schools? My calls to Governor Cuomo’s office and the New York State Department of Health have as yet gone unanswered, and subsequent press releases have not issued updates on the status of these data.
I hope to return with an update on this dashboard next week. In the meantime, for a thorough look at why school COVID-19 data are so important and the barriers that such data collection has faced so far, I highly recommend this POLITICO feature by Bianca Quilantan and Dan Goldberg.
Since last week’s issue, four more forms of official state reporting on COVID-19 in schools have come to my attention:
New Hampshire is publishing school-associated case data, including active cases, recovered cases, and outbreak status (not clearly defined) on a page of the state’s dashboard, updated daily.
Mississippi is publishing a weekly report on cases, quarantines, and outbreaks among students, teachers, and staff, aggregated by county. So far, the state has released reports on the weeks ending August 21 and August 28.
Hawaii’s state Department of Education is publishing a page on COVID-19 in the school district, updated weekly. (Did you know that the entire state of Hawaii comprises a single school district?)
New York is launching a public dashboard on COVID-19 in schools; this dashboard will be available starting on September 9. So far, the page states that “New York school districts will be required to provide the Department of Health with daily data on the number of people who have tested positive for COVID-19 beginning Tuesday, September 8th.” Last week, Mayor Bill de Blasio announced that classes in New York City would be delayed by two weeks to allow for more extensive safety precautions.
In addition, the nonprofit civic data initiative USAFacts has compiled a dataset of reopening plans in America’s 225 largest public school districts. The dataset classifies reopening plans as online, hybrid, in-person, or other, with information as of August 17.
Meanwhile, on the higher education front:
Education reporter (and friend of this newsletter!) Benjy Renton has launched a dashboard keeping track of COVID-19 outbreaks on college and university campuses. The dashboard organizes outbreaks according to their alert level, based on new cases in the past week.
I am continuing to monitor the COVID-19 metrics reported by college and university dashboards in my comparison spreadsheet. I haven’t had the chance to expand this analysis much in the past week, but the project is ongoing.
Alisha Morris’ crowd-sourced spreadsheet of COVID-19 cases, closures, and other reopening plan changes in K-12 schools has leveled up, thanks to collaboration with the National Education Association. Concerned students, parents, and teachers can now explore the data and report cases on a new NEA-run website. This NPR article explains more of the project’s background and findings.
Brown University and national education organizations are working together to track school pandemic responses. Their findings, planned to be released in a dashboard this September, will include: enrollment, poverty level, internet access, COVID-19 cases, and district opening status.
Duval County Public Schools, a district in Florida, was barred from reporting school-related COVID-19 cases by the county’s public health department. According to the Florida Times-Union, the county stated that the school district could not publish “school specific data related to COVID-19” without the state health department’s permission.
Meanwhile, in Kentucky: this state’s COVID-19 dashboard now includes both a K-12 school public health report and a college and university public health report. Both reports, published daily, include active cases, deaths, and other statistics.
Politicians on both sides of the aisle, along with public health leaders such as the CDC’s Dr. Robert Redfield and NIAID’s Dr. Anthony Fauci, agree that returning to in-person learning is crucial for public health. Many children rely on food and health resources provided by schools. Parents rely on childcare. Without in-person schools, it is difficult for teachers and other mandated reporters to identify cases of child abuse. And all school students, from kindergarteners to college kids, are facing the mental health deterioration that comes from limited social interaction with their peers.
But in deciding whether and how to return to in-person learning, school districts around the country are facing the same challenge that states faced early in the pandemic: they’re on their own. Some districts may have guidance from local government; in New York, for example, schools are allowed to reopen if they are located in a region where under 5% of COVID-19 tests return positive results. Every county in the state meets this guideline, and the state as a whole has had a positivity rate under 1% for weeks.
Still, low community transmission does not guarantee that schools are safe to reopen. Teachers in New York City have protested the city’s reopening plan, citing poor ventilation, the lack of a regular testing plan, and other health concerns. Teachers in Detroit, in districts outside Phoenix, and elsewhere across the country are considering strikes. Earlier this week, the White House formally declared that teachers are essential workers, meaning they could continue working after exposure to COVID-19; Randi Weingarten, president of the American Federation of Teachers, called the declaration a move to “threaten, bully, and coerce” teachers back into unsafe classrooms. Meanwhile, many colleges and universities are planning to bring students from out of state into the communities around their campuses.
As conversations on school reopening intensify at both national and local levels, a data journalist like myself has to ask: what data do we have on the topic? Is it possible to track how school reopening is impacting COVID-19 outbreaks, or vice versa?
The answer, as with any national question about COVID-19, is that the data are spotty. It’s possible to track cases and deaths at the county level, but no source comprehensively tracks testing at any level more local than the state. As a result, it is impossible to compare test positivity rates—that crucial metric many districts are using to determine whether they can safely reopen—both broadly and precisely across the country.
The best a data journalist can do is represented in this New York Times analysis. The Times pulled together county-level data from local public health departments and evaluated whether schools in each county could safely open based on new cases per 100,000 people and test positivity rates. Test positivity rates are difficult to standardize across states, however, because different states report their tests in different units. And, if you look closely at this story’s interactive map, you’ll find that some states—such as Ohio, New Hampshire, and Utah—are not reporting testing data at the county level at all.
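To make the “different units” problem concrete, here is a small illustration with made-up numbers: the same set of positive results yields a noticeably different positivity rate depending on whether a state’s denominator counts unique people tested or total specimens run.

```python
# Illustration (with made-up numbers) of why test units matter:
# the same count of positives produces different positivity rates
# depending on what the state uses as the denominator.

positives = 500
unique_people_tested = 8_000   # each person counted once
total_specimens = 12_000       # repeat tests counted every time

positivity_by_people = positives / unique_people_tested * 100    # 6.25%
positivity_by_specimens = positives / total_specimens * 100      # ~4.17%

print(f"{positivity_by_people:.2f}% vs. {positivity_by_specimens:.2f}%")
```

A two-point gap like this is large enough to move a district across a reopening threshold, which is why comparing positivity rates across states that report in different units is so fraught.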
Still, some research projects and volunteer efforts are cropping up to document COVID-19 in schools as best they can. I will outline the data sources I’ve found here, and I invite readers to send me any similar sources that I’ve missed so that I can feature them in future issues.
How schools are reopening
COVID-19 Testing in US Colleges: Sina Booeshaghi and Lior Pachter, two researchers from Caltech, put together a database documenting testing plans at over 500 colleges and universities throughout the U.S. The database is open for updates; anyone who would like to suggest an edit or contribute testing information for a new school can contact the researchers, whose email addresses are listed in the spreadsheet. Booeshaghi and Pachter wrote a paper on their findings, which is available in preprint form on medRxiv (it has not yet been reviewed by other scientists).
The College Crisis Initiative: Davidson College’s College Crisis Initiative (or C2i) maps out fall 2020 plans for about 3,000 colleges and universities. Clicking on a college in the interactive map leads users to see a brief description of the school’s opening policy, along with a link to the school’s website. Corrections may be submitted via a Google form.
District Budget Decisions: Edunomics Lab at Georgetown University has compiled a database of choices school districts are making about how to change their budgets and hiring during the COVID-19 pandemic. The database includes 302 districts at the time I send this newsletter; district choices are categorized as budget trimming, salary reductions, benefits adjustment, furloughs, and layoffs.
Reporting COVID-19 in schools and districts
COVID-19 in Iowa: Iowa’s state dashboard includes a page which specifically allows users to check the test positivity rates in the state’s school districts. Click a school district in the table on the left, and the table on the right will automatically filter to show how testing is progressing in the counties encompassed by this district. So far, Iowa is the only state to make such data available in an accessible manner; other states should follow its lead.
NYT COVID-19 cases in colleges: Journalists at the New York Times surveyed public and private four-year colleges in late July. The analysis found at least 6,600 cases tied to 270 colleges since March. This dataset is not being actively updated, but it is an informative indicator of the schools that faced outbreaks in the spring and summer.
Individual school dashboards: Any large college or university that chooses to reopen, even in a partial capacity, must inform its students of COVID-19’s progress on campus. Some schools are communicating through regular emails, while others have put together school-specific dashboards for students, professors, and staff. Two examples of school dashboards can be found at Boston University and West Virginia University; at other schools, such as Georgia Tech, students have spun up their own dashboards based on school reports.
Reopening gone wrong
K-12 school closures, quarantines, and/or deaths: Weeks ago, Alisha Morris, a theater teacher in Kansas, started compiling news reports on instances of COVID-19 causing schools to stall or alter reopening plans. Morris’ project grew into a national spreadsheet with hundreds of COVID-19 school case reports spanning every U.S. state. She now manages the sheet with other volunteers, and the sheet’s “Home” tab advertises a new site coming soon. You can explore the dataset through a Tableau dashboard created by one volunteer.
Datasets under development
FinMango and Florida COVID Action collaboration: FinMango, a global nonprofit which has pivoted to help COVID-19 researchers, has partnered with Florida COVID Action, a data project led by whistleblower Rebekah Jones, to track COVID-19 cases in K-12 schools. The project, called the COVID Monitor, has already been compiling reports from media and members of the public since July. It includes about 1,300 schools with confirmed or reported COVID-19 cases so far, 200 of which are in the project’s home state of Florida.
ProPublica school reopening survey: A new initiative from ProPublica asks students, parents, educators, and staff to report on their schools’ reopening plans. Readers who might prefer to share information with ProPublica through more private means can get in touch on Signal or visit the publication’s tips page.
Nature university reopening survey: Similarly to ProPublica, Nature News is surveying its readers on their reopening experiences. This survey specifically calls on research scientists to share how they will be teaching and whether they agree with the approach their university has taken on reopening. Respondents who wish for more privacy can use Signal or WhatsApp.