The intense debates over school openings are missing something crucial: numbers. Without them, it's essentially impossible to know what to do, or to evaluate what is being proposed.
Here's an analogy. Suppose that the Food and Drug Administration is contemplating a new food safety regulation, or that the Department of Transportation is considering new restrictions on railroads. The White House Office of Information and Regulatory Affairs is supposed to require it to identify the gains and the losses — the benefits and the costs.(1)
Those numbers might not be decisive, but they're needed. In their absence, the decision whether to proceed is essentially a stab in the dark.
To be sure, some numbers might be hard to specify. The agencies might not know enough to provide them. But officials have well-established techniques for dealing with that problem. For example, agencies might be asked to disclose the ranges, including the best and worst cases, and their respective likelihoods.
It's true that politics might intervene, and you might not be able to trust the numbers. But when the system is working well, they are checked and rechecked by people who know what they are doing, and aren't affected by political considerations.
The decision whether and how to reopen schools is being made by states and localities, not by Washington, and numbers need to inform those choices. The problem is that for school openings (and much more), we're mostly hearing abstractions and generalities — expressions of agitation and fear.
On the one hand, reasonable people are pointing to the immense strain that having young kids at home places on parents, and to the many problems with online learning. On the other hand, reasonable people (including teachers' unions) are pointing to the risk of an outbreak and a spike in deaths.
In the abstract, these are legitimate concerns. For many school systems, there are going to be trade-offs here. But numbers could make apparently hard questions much easier to answer and could help depoliticize the process.
Imagine, for example, a school district in which the number of community infections is very low, and in which real experts (epidemiologists and others, not politicians or those influenced by them) say: "With appropriate precautions, the risk of a real outbreak is vanishingly small, and we're highly unlikely to lose any lives as a result of opening." In such a district, opening the schools is a no-brainer.
By contrast, imagine a district in which the number of community infections is not low, and in which the experts say: "Even with appropriate precautions, the risk of a real outbreak is significant, and over the course of the school year, we're likely to lose at least 50 lives as a result of opening." Opening the schools would seem to be a mistake.
In Massachusetts, officials have reportedly moved in the direction of using numbers, with guidance that relies on how much the coronavirus is spreading in relevant districts. Color-coded maps specify whether the risk of spreading is "low," "moderate" or "high," based on recent infection rates. If a district is low risk, officials will apparently recommend full-time in-person instruction. If the risk is moderate or high, a district might consider remote-only or some hybrid model.
That's progress. It's a lot better than pure guesswork. But is it right?
To know, we would need to do at least three things. First, be very clear on the meaning of low, moderate and high. Second, understand the incremental public-health risk if a school district opens, given the specific category into which it falls. Third, turn that incremental risk into the relevant numbers, which include infections and deaths.
It's possible, of course, that public-health specialists, in Massachusetts and elsewhere, have done or are trying to do all of that. It is also possible that it's tough to produce the relevant numbers; epidemiologists and others might insist that they would depend on a lot of speculation. For example: How many schoolchildren will end up respecting the protocols? If a large number of them don't, what's the incremental risk?
In the world of regulation, hard or unanswerable questions are not unfamiliar. The experts typically develop scenarios, based on optimistic and pessimistic assumptions. Armed with that information, policymakers are often in a good position to know whether to proceed.
Of course, school districts can maintain flexibility. Some of them might allow full-time in-person classes in the hope that the optimistic assumptions are right. But if those assumptions turn out to be wrong, and if infections spike, districts need a plan to shift to online learning, perhaps in a hurry. (A relevant saying: "If you make a plan, God laughs. If you make two plans, God smiles.")
As the school year begins, some of the trade-offs might call for a political judgment, informed but not determined by the numbers. Suppose that the experts say this: "If you open, you're unlikely to have anything like an outbreak. But there will be more infections than there would be if learning were online; and over the course of the year, some number of people will die. That number will be small — but above zero."
If that's what they say, the choice might turn out to be very difficult. But here, too, it's not unfamiliar. When we regulate or don't regulate automobiles, and when we regulate or don't regulate air pollution, we are making similarly difficult choices.
To make those choices sensibly, and to promote accountability, we need numbers. Let's get more of them.
(1) I served as administrator of that office under President Barack Obama.
Cass R. Sunstein is a Bloomberg Opinion columnist. He is the author of "The Cost-Benefit Revolution" and a co-author of "Nudge: Improving Decisions About Health, Wealth and Happiness." This piece was written for Bloomberg Opinion.