Science’s COVID-19 reporting is supported by the Heising-Simons Foundation.
Sixteen pandemic months have felt disorienting and arduous—but along the arc of human history, COVID-19 marks just another inflection point. Epidemics have punctuated humanity’s timeline for centuries, sowing panic and killing millions, whether the culprit was plague, smallpox, or influenza. And when infections abate, their imprints on society can remain, some short-lived and some enduring.
In a series of news articles over the coming months, Science will consider how a new normal is emerging in the scientific world. Of course, COVID-19 is still with us, especially outside the minority of countries now enjoying the fruits of widespread vaccination. Still, as the pandemic enters a different phase, we ask how research may be changing, how scientists are navigating these waters, and in what directions they are choosing to sail.
Although the past may not presage the future, epidemic history illuminates how change unfolds. “Historians often say that what an epidemic will do is expose underlying fault lines,” says Erica Charters, a historian of medicine at the University of Oxford who is studying how epidemics end. But how we respond is up to us. “When we ask, ‘How does the epidemic change society?’ it suggests there’s something in the disease that will guide us. But the disease doesn’t have agency the way humans do.”
Past epidemics have spurred scientists and physicians to reconsider everything from their understanding of disease to their modes of communication. One of the most studied, the bubonic plague, tore through Europe in the late 1340s as the Black Death, then sporadically struck parts of Europe, Asia, and North Africa over the next 500 years. Caused by bacteria transmitted via the bites of infected fleas, the plague produced grotesquely swollen lymph nodes, seizures, and organ failure. Cities were powerless against its spread. In 1630, nearly half the population of Milan perished. In Marseille, France, in 1720, 60,000 died.
Yet the mere recording of those numbers underscores how medicine reoriented in the face of the plague. Until the Black Death, medical writers did not routinely categorize distinct diseases, and instead often presented illness as a generalized physical disequilibrium. “Diseases were not fixed entities,” writes Frank Snowden, a historian of medicine at Yale University, in his book Epidemics and Society: From the Black Death to the Present. “Influenza could morph into dysentery.”
The plague years sparked more systematic study of infectious diseases and spawned a new genre of writing: plague treatises, ranging from pithy pamphlets on quarantines to lengthy catalogs of potential treatments. The treatises cropped up across the Islamic world and Europe, says Nükhet Varlık, a historian of medicine at Rutgers University, Newark. “This is the first disease that gets its own literature,” she says. Disease-specific commentary expanded to address other conditions, such as sleeping sickness and smallpox. Even before the invention of the printing press, the treatises were apparently shared. Ottoman plague treatises often contained notes in the margins from physicians commenting on this or that treatment.
Plague and later epidemics also coincided with the rise of epidemiology and public health as disciplines, although some historians question whether the diseases were always the impetus. From the 14th to 16th centuries, new laws in the Ottoman Empire and parts of Europe required collection of death tolls during epidemics, Varlık says. Plague also hastened the development of preventive tools, including separate quarantine hospitals, social distancing measures, and, by the late 16th century, contact-tracing procedures, says Samuel Cohn, a historian of the Middle Ages and medicine at the University of Glasgow. “All of these things that a lot of people think are very modern … were being devised and developed” back then. The term “contagio” took off, as officials and physicians sought to ascertain how plague was spread.
Cholera, caused by a waterborne bacterium, devastated New York and other areas in the 1800s. It gave rise not only to new sanitation practices, but also to enduring public health institutions. “Statistics had proven what common sense had already known: In any epidemic, those who had the faintest chance of surviving were those who lived in the worst conditions,” historian of medicine Charles Rosenberg, now an emeritus professor at Harvard University, wrote in his influential book The Cholera Years: The United States in 1832, 1849, and 1866. To improve those conditions, New York City created its Metropolitan Board of Health in 1866. In 1851, the French government organized the first in a series of International Sanitary Conferences that would span nearly 90 years and help guide the founding of the World Health Organization in 1948. Cholera “was the stimulus for the first international meetings and cooperation on public health,” Rosenberg says now.
Meanwhile, efforts to decipher disease continued: Although physicians who eyed germs as culprits remained a minority in the mid-1800s, disease “was no longer an incident in a drama of moral choice and spiritual salvation,” but “a consequence of man’s interaction with his environment,” Rosenberg wrote. Fleas were identified as the carrier of plague during a global pandemic in the late 1800s and early 1900s, and the concept of insects as vectors of disease has influenced public health and epidemiology ever since.
A curious mix of remembering and forgetting trails many epidemics. Some quickly vanish from memory, says David Barnes, a historian of medicine at the University of Pennsylvania. The 1918 flu, which killed an estimated 50 million people worldwide but was overshadowed by World War I, is a classic example of a forgotten ordeal, he says. “One would expect that that would be a revolutionary, transformative trauma, and yet very little changed” in its wake. There was no vast investment in public health infrastructure, no mammoth infusion of money into biomedical research. Although the 1918 pandemic did help spur the new field of virology, that research advanced slowly until the electron microscope arrived in the early 1930s.
In contrast, the emergence of HIV/AIDS in the 1980s left a potent legacy, Barnes says. A new breed of patient-activists fought doggedly for their own survival, demanding rapid access to experimental treatments. They ultimately won the battle, reshaping policies for subsequent drug approvals. But, “It wasn’t the epidemic per se—the damage, the death toll of AIDS—that made that happen,” Barnes says. “It was activists who were organized and persistent, really beyond anything our society had ever seen.”
It’s through this lens of human agency that Barnes and other historians contemplate COVID-19’s potential scientific legacy. The pandemic, like its predecessors, cast light on uncomfortable truths, ranging from the impact of societal inequities on health to waste in clinical trials to paltry investments in public health. Questions loom about how to buttress labs—financially or otherwise—that were immobilized by the pandemic.
In COVID-19’s wake, will researchers refashion what they study and how they work, potentially accelerating changes already underway? Or will what Snowden calls “societal amnesia” set in, fueled by the craving to leave a pandemic behind? The answers will come over decades. But scientists are beginning to shape them now.