Extract: A Shot to Save the World by Gregory Zuckerman

This entry was posted on 28 January 2022.

This is the definitive account of the global effort to develop a vaccine for Covid-19, charting the failure and success of every major vaccine in use. A number-one New York Times bestselling author and award-winning Wall Street Journal investigative journalist, Zuckerman takes us inside the top-secret laboratories, corporate clashes, and high-stakes government negotiations that led to the development and rollout of effective vaccines. A Shot to Save the World is the story of how science saved the world.

1

1979–1987

Before and after. The epidemic would cleave lives in two, the way a great war or depression presents a commonly understood point of reference around which an entire society defines itself.

— Randy Shilts, And the Band Played On

Henry Masur was getting desperate.

The young man before him was short of breath, feverish, and couldn’t stop coughing. Masur, a thirty-three-year-old in his first week as an attending physician at New York Hospital on Manhattan’s Upper East Side, ran a battery of tests but couldn’t make sense of his patient’s symptoms. A security guard at a different Manhattan hospital, the man didn’t appear to have any underlying diseases. Between gasping breaths, he said he had already visited several New York hospitals and doctors. No one could help him.

The man’s heartbeat raced and his oxygen-saturation levels plummeted. Ninety-five percent … ninety-four … ninety-three. Any lower and death was a risk. Masur couldn’t figure out what he was dealing with. A bad strain of tuberculosis? A new fungus? Something more dangerous? He consulted colleagues and scoured medical literature but couldn’t find an answer.

Time was running out. Masur still needed more information. He decided to operate, a huge risk given his patient’s frail condition.

I have to get a piece of his lung.

Hours later, a hospital pathologist looked up from his microscope and delivered an answer: The young man had pneumocystis carinii pneumonia.

Masur was stunned.

How can that be?

Masur happened to be one of the rare experts on this kind of pneumonia. A few years earlier, when he was starting a fellowship in infectious diseases and tropical medicine, he found himself the most junior recruit in his laboratory, with little choice of which microorganism to study. Malaria and all the other cool, headline-grabbing infections, bugs, and epidemics had already been claimed. Masur got stuck with pneumocystis pneumonia. His colleagues tried to stifle their giggles. At one time, this pneumonia afflicted hundreds of malnourished children each year in Eastern Europe and elsewhere. By the late 1970s, though, it affected only about seventy patients in the entire United States each year, almost always those with compromised immune systems, such as cancer patients. Masur’s lab director assured him that there was value in studying the infection, but Masur knew he was unlikely ever to encounter an actual patient dealing with pneumocystis pneumonia.

Now, it was the fall of 1979, and Masur was confronting a case — during his very first week as an attending physician. And his patient was a healthy adult. It didn’t seem remotely possible.

Masur decided to administer a drug being tested on childhood-leukemia victims suffering from the same infection. Masur’s patient’s condition stabilized enough for him to eventually leave the hospital. Before Masur could relax, though, he was confronting additional cases of this rare pneumonia. So were physicians in other New York hospitals, as were doctors in Chicago, Atlanta, Los Angeles, and San Francisco.

Six foot one and rail thin, with a high forehead and jet-black hair, Masur was a ponderer, especially when confronted with important decisions and quandaries. He tended to fixate on problems until he could develop a solution. Walking home after another long day, Masur slowly crossed the street to his one-bedroom hospital-owned apartment, trying to make sense of all the men with the once-rare infection. At night, he and his wife, a nurse at the hospital, debated the cases, searching for an explanation.

Within months, Masur’s original patient had died, and more struggling, suffering young men were coming to see Masur and his colleagues around the world. In London, patients battling pneumocystis pneumonia and other puzzling infections and tumors staggered into St. Stephen’s Hospital in Chelsea. Doctors at the hundred-year-old institution noticed the patients were likely to be either gay men or intravenous drug users, but those observations didn’t explain the nature of the illness or how it could be halted. They knew something was making the men susceptible to these rare illnesses, but they had no idea what it was.

The physicians were reeling. Just a few months earlier, they had been confident and upbeat. In the previous decade or so, enormous progress had been made preventing and treating all kinds of illnesses, including heart disease, diabetes, and some cancers. Powerful antibiotics and accurate diagnostic tests had been introduced, and modern medicine seemed on the verge of wiping out most infectious diseases.

Now the physicians were confronting a malady they couldn’t stem, treat, or even understand. Fear and frustration overtook them.

“Apart from compassion and giving some pain relief, we were completely impotent,” recalls Jeremy Farrar, who was a young doctor at St. Stephen’s in London. “It leaves a scar the rest of your life.”

Infectious-disease experts felt especially helpless. Many had entered the field precisely because they wanted to cure patients, not simply improve a condition or keep illness at bay, which is often the most doctors can hope to do while treating cancer, cardiovascular disease, or certain other sicknesses.

“I like seeing a patient, making a diagnosis, giving treatment, and seeing them get better,” says H. Clifford Lane, who worked in a lab at the National Institute of Allergy and Infectious Diseases (NIAID), a part of the National Institutes of Health. “Now, all of a sudden, you’re seeing patients your own age and we can’t treat them, and we don’t even know what it is they’re dealing with.”

Researchers concluded that most people were contracting the unknown illness through sexual transmission, as a virus crossed mucosal tissues lining the genital tract, rectum, or other body cavities. Others were becoming infected through the bloodstream, sometimes by sharing needles. In 1982, the Centers for Disease Control and Prevention in Atlanta gave the disease a name: acquired immune deficiency syndrome, or AIDS. Investigators at both the Pasteur Institute in Paris and the National Cancer Institute in Washington, D.C., determined that a new human retrovirus, eventually named the human immunodeficiency virus, or HIV, was causing AIDS.

The virus spread quickly. At Albert Einstein College of Medicine in New York, five Black infants were admitted showing signs of severe immune deficiency. Anxieties grew, even among professionals accustomed to disease and death. Some pathologists refused to do postmortems, worried they might contract the new disease. Fears raced through the broader society. Later, when an Indiana teenager named Ryan White was infected through contaminated blood products used to treat his hemophilia, anxious parents forced the school to block the boy from attending class.

Government officials tried to understand the disease and how it might be stopped, though some started from positions of remarkable ignorance. In 1983, as staffers and health officials briefed Margaret Heckler, the secretary of health and human services, about the disease, she appeared confused about one way AIDS was transmitted.

“Anal intercourse?” Heckler asked, turning to a close aide who was gay. “You do that?”

“I think we better come back and discuss this a little later,” another staffer told Heckler.

As scientists gained a better grasp of the disease, some became optimistic they could develop a vaccine, perhaps quickly. Sure, it had taken about five decades to develop shots for typhoid, polio, and measles after their causes had been determined, but medical science was progressing at a rapid clip. At a press conference on April 23, 1984, Heckler voiced confidence a solution was on the horizon.

“We hope to have such a vaccine ready for testing in approximately two years,” she told reporters.

Government scientists who followed Heckler to the podium were nearly as sanguine. They had history on their side: Traditionally, vaccines were how most epidemics ended. Indeed, few figures are as revered as those responsible for creating shots capable of wiping out plagues and disease. These scientists often emerged as living legends — even when their contributions were a bit exaggerated.

***

 

In the summer of 1774, a farmer in southern England named Benjamin Jesty noticed that one of his dairymaids seemed resistant to smallpox. Earlier that year, the young woman, Anne Notley, had cared for a family afflicted with smallpox, a disease that killed three in ten infected people while causing blindness and other complications in others. Yet, she emerged unscathed. Jesty knew that Notley, like other milkmaids, had previously been infected with a less serious but related pathogen called cowpox, which spread from the udders of infected cows.

Jesty had an idea: He took one of his wife’s knitting needles and scraped pus from one of his cows showing signs of cowpox. Then, Jesty intentionally infected his family with the material. Later, when an outbreak of smallpox raged through the region, members of the Jesty family were protected from the disease. Pushing his luck a bit, Jesty deliberately infected his sons with smallpox — still no sign of infection. (1) Far from impressed, locals were fearful. Some fretted that Jesty’s needlework would turn his family into “horned beasts.” Eventually the Jesty clan was forced to flee for the Isle of Purbeck on the English Channel.

Word spread about Jesty’s inoculations (and how the family remained hornless), and British doctors began attempting similar procedures. In 1796, a physician named Edward Jenner exposed an eight-year-old boy to cowpox; when Jenner later infected the boy with smallpox, the cowpox protected him from the disease without signs of even localized inflammation or infection. Jenner inoculated others as well. Unlike Jesty, Jenner evaluated his subjects, analyzed their results using proper scientific methods, and published his findings.

Before long, the country, and later the world, would embrace vaccination as a means to eradicate smallpox. Jesty’s ingenuity would be overlooked when Jenner’s biographer attributed the genesis of mankind’s first vaccine to Jenner’s observation that a beautiful local milkmaid was resistant to smallpox. The milkmaid’s image proved more memorable than Jesty’s dirty needles.

Other vaccine pioneers demonstrated their own originality, even as they generated other kinds of controversy. In the 1940s, for example, a young virologist named Jonas Salk began publishing academic papers that were imaginative in their conclusions, though they sparked criticism, partly because the stated results were often based on limited data.

“I engaged in extrapolation because I had always felt that it was a legitimate means of provoking scientific thought and discussion,” Salk later explained. “I engaged in prediction because I felt it was the essence of scientific thought. The fact that neither extrapolation nor prediction was popular in virological circles seemed to me to be a shame.”

Salk spent several years searching for a vaccine for polio, an infectious disease that was killing thousands of people a year and paralyzing tens of thousands, many of them children. At the time, most scientists were trying to use live, but weakened, viruses in their vaccines, similar to Jesty’s approach with smallpox. Salk tried a different tack: He grew samples of the polio virus in his lab at the University of Pittsburgh and killed, or inactivated, the pathogen by adding formaldehyde, a method that had worked for vaccines for rabies and cholera. Salk tested his shots on thousands of children, and even his own family, showing in 1953 that they worked more than 60 percent of the time. His results sparked singing, dancing, and other celebrations throughout the United States as a grateful nation embraced Salk as a hero, his image appearing on the front pages of newspapers, the covers of glossy magazines, and on television newscasts. Later, Salk’s bitter rival, Albert Sabin, introduced an oral polio vaccine based on a weakened, or attenuated, version of the virus, and it too proved effective. Together, the two vaccines effectively ended the scourge of polio for much of the world.

All vaccines work more or less the same way — by teaching and enabling the body’s complex immune system to fight off pathogens. The human immune system features two lines of defense. A fast-acting, first-line “innate” immune system is composed of various white blood cells, such as macrophages, dendritic cells, and natural killer cells that stand guard at the body’s gateways — the skin, nose, throat, etc. — to detect and fend off viruses and other foreign invaders.

The innate immune system doesn’t need prior exposure to a pathogen to be activated against it, but it can have trouble handling especially powerful or clever pathogens. For these difficult battles, the body’s “adaptive” immune system joins the fight. Sensing danger, it sends other kinds of white blood cells, including T cells, which can recognize specific pathogens, and B lymphocytes, or B cells, which produce powerful antibodies to battle the pathogens.

These cells do a more efficient job than those of the innate immune system. T cells play important defensive roles, while B cells produce battalions of antibodies specifically trained to take on invaders. The problem is that the adaptive immune system is strong but a bit slow. It takes time deciding whether an invader is dangerous enough for it to send sufficient T cells and B cells to combat the intruder, giving a virus the opportunity to strengthen its hold and infect the body’s cells.

That’s where vaccines come in. Injected into the body’s bloodstream, traditional vaccines contain weakened or killed versions of what would otherwise be powerful pathogens. Once introduced into the body, the invading agents trigger the body’s adaptive immune system to pursue and disable them. The pathogen in the vaccine is harmless, but the body fights it off nonetheless, treating the weakling force as if it were a threatening army. The adaptive immune system, unable to shake memories of this simulated battle, continues to send antibodies to patrol for new signs of the pathogen, while training them to return to attack mode if there’s an invasion of a genuine foe bearing similarity to the one encountered as part of the vaccine.

Because vaccines usually include weakened viruses, inactivated viruses, or some other facsimile of the real thing, it’s rare that they infect the body with the disease they’re meant to prevent. Years after Salk’s and Sabin’s breakthroughs, scientists would introduce vaccines that rely on other approaches, but they would have the same goal: activate and teach the immune system to disable a future intruder.

 

(1) They did suffer inflammation, but it was probably the result of Mrs. Jesty’s dirty needles.

 

Extracted from A Shot to Save the World by Gregory Zuckerman, out now.

 
