December 12, 2019 · 15 min read
The conquest of infectious disease is the greatest triumph of medicine, and one of the greatest stories in the annals of human progress.
A CDC report says that in the US in 1900, “the three leading causes of death were pneumonia, tuberculosis (TB), and diarrhea and enteritis, which (together with diphtheria) caused one third of all deaths. Of these deaths, 40% were among children aged less than 5 years.” Many diseases acted quickly—malaria, smallpox, pneumonia or cholera could kill in weeks or even days—and there was no diet or exercise regimen associated with lower risk.
Since then, deaths from infectious disease in the US have declined by more than 90%. Outside of countries that are too poor to have good water sanitation or mosquito control, infectious disease is relatively rare, and usually curable.
What were these diseases? And how did we reduce them so drastically?
To survey the field, I made a list of diseases in a fairly non-rigorous manner from multiple sources, including a CDC report, a WHO report, a survey in JAMA, and a long list in Wikipedia. I filtered somewhat subjectively for current and historical importance, and ended up with a list of 45 diseases or types of infection. For each, I looked up the cause, and classified it according to the type of microorganism and mode of infection. (You can see the full list here.)
Infectious diseases are often classified by the biological category of the microorganism that causes them. Most often, the culprit is a bacterium or a virus; occasionally, it's a protozoan, a fungus, or even a prion. This classification is helpful for understanding what types of drugs might cure the infection (see below).
But I found it most helpful to classify them by how they enter the body and thus how they spread through a population. (The following is my own classification, and not from any medical text.)
Opportunistic infections. These get in the body by bypassing its normal defenses, especially by going directly into a wound, whether from an accident, a battle, or a surgery. Unlike other diseases, these usually aren’t caused by a specific virus or species of bacteria; they can be caused by any of many different types of germs that are commonly present in the environment, which all cause similar symptoms. In this category I place the “suppuration” or putrefaction of battlefield and surgical wounds, some common forms of pneumonia, and childbed or “puerperal” fever, which is an infection that occurred often in hospital maternity wards before antiseptics. (I also tentatively include tetanus in this category, although that is caused by a specific bacterium with identifiable symptoms, because it also tends to enter through a wound.)
Animal-borne diseases. These are contracted through contact with animals, usually insects and other pests: mosquitoes, lice, ticks, fleas, rats, etc. This category includes the so-called “tropical diseases”, which at one time extended far beyond the tropics. Pest-borne diseases include most notably malaria, but also yellow fever, typhus, and the bubonic plague that was responsible for the infamous Black Death. A few other diseases are spread by pets (rabies) or livestock (anthrax).
Ingested diseases. These spread through contaminated water or food. This category includes cholera (of the famous Broad Street outbreak), typhoid fever (not to be confused with typhus, which is a different disease), and some forms of gastroenteritis (infection of the stomach and/or intestines, often resulting in diarrhea, including dysentery).

Contagious diseases. These are caught directly from other people, through the air, physical contact, or sexual contact. This category includes smallpox, measles, influenza, tuberculosis, and whooping cough, as well as sexually transmitted diseases such as syphilis and AIDS.
The relevance of these categories will become clear when we look at methods of prevention.
Out of the list of diseases, I somewhat arbitrarily chose twenty that seemed most deadly and/or historically significant, trying to get at least a couple in each of the categories above. Then I looked up, for each one: What was the first highly effective technique against it? And what other significant techniques have been applied? From this survey I got a good overview of humanity’s weapons against disease. I group them into three broad categories:
Environment. The first line of defense is to prevent live germs from ever entering the body: removing them from the environment, killing them in the environment, or using barriers to contain their spread (such as plastic gloves or bags). This includes antiseptics, sterilization, pest and animal control, sanitation, condoms, and general hygiene such as hand-washing. Effective control of the environment prevents infection completely.
Immunity. If germs do enter the body, the best defense is the body’s own immune system, provided it has been primed to counteract the disease. An effective immune response means that the germs cannot multiply, symptoms do not manifest, and thus an infection never turns into a full disease. Smallpox inoculation was the first form of immunization, but today our tool for this is vaccination.
Pharmaceuticals. If a disease does manifest, the last line of defense is drugs. By far the most effective drugs we have are antibiotics; pretty much any bacterial infection can be cured with them (other than resistant strains). Antitoxins are another successful class of drugs, and there are also scattered antiprotozoals (quinine for malaria), antivirals (which have had success treating AIDS), etc.
Environmental control and immunization are both ways to prevent disease; pharmaceuticals are a way to cure disease that we fail to prevent.
Let me elaborate on each of these and give a bit of history.
Isolation and quarantine could be considered the earliest attempts at environmental control. Long before the germ theory was established, it was clear from common-sense observation that many diseases were contagious. Keeping disease victims away from a town, or forcing them to isolate themselves, was a natural response. This was particularly important, and relatively effective, when applied to arriving ships, which could practically be detained long enough to detect any disease—the word “quarantine” derives from the Italian for “forty”, the number of days this isolation would last. Quarantine worked to some degree (it kept smallpox out of Australia), but it couldn’t be enforced against every oxcart and vagrant coming in by the roads, and in the end it wasn’t effective enough to defeat any major disease.
Antiseptics were the first highly effective environmental strategy. Prior to the late 1800s, the sanitary practices of doctors and hospitals were atrocious. Doctors would perform surgeries or even autopsies and move right on to the next patient without so much as washing their hands. Why not? To the extent there was any idea of contagion before the germ theory, people generally thought that the seeds of disease traveled through the air, or they blamed rotting plant or animal matter. Contaminated water or bodily fluids from live patients were not particularly suspect.
The lack of hygiene was particularly dangerous for two classes of patients: those undergoing surgery, and women giving birth. Both had alarming fatality rates until proper sanitary procedures were developed. Ignaz Semmelweis, a Hungarian physician, was the first to propose that contagion was the cause of childbed fever, and he suggested a solution. He knew the characteristic smell of the disease, and that chlorine could remove such smells. Based on this, in 1847 he proposed that doctors wash their hands in chlorine before performing a delivery, which dramatically reduced maternal mortality at the Vienna General Hospital. Later, in 1865, the English surgeon Joseph Lister, then working in Glasgow, was pondering the problem of the suppuration of surgical wounds. Based on a report from Pasteur that microorganisms were commonly found in the atmosphere, he hypothesized that infection was the cause of this disease, and developed a method to kill germs in the environment using carbolic acid. Lister’s method (along with anesthesia, first demonstrated in 1846) transformed surgery from a nightmarish procedure used only as a last resort in life-threatening situations, into a relatively safe and effective treatment.
Antiseptics, and related techniques such as sterilization of instruments, were thus effective against the opportunistic infections that occurred in the unique environment of the hospital, where patients were unusually susceptible. Other diseases, however, lurked in the general environment.
Sanitation was needed to protect against diseases from contaminated food and especially water. As John Snow showed in 1855, cholera could be traced to water supplies contaminated by sewage, and it would turn out that typhoid fever was also water-borne. The most effective techniques, from what I can tell, were to get water from clean sources, to filter it, to kill germs by adding chlorine, and to make sure that sewage was disposed of downstream from sources of drinking water, ideally diluted.
Sanitation and general hygiene were promoted by a variety of reformers in the mid-1800s and early 1900s. England had Edwin Chadwick, who wrote the influential report Sanitary Condition of the Labouring Population of Great Britain in 1842. The hospitals had Florence Nightingale, who pushed for hygienic reform in the Crimean War and among the British Army in India. Germany had Max von Pettenkofer, who persuaded the city of Munich to bring in clean water from the mountains; from 1880 to 1898, the mortality from typhoid fever in Munich fell by more than 80%, from 72 to 14 per million. Encouraged by successes like these, US cities began to follow suit in the early 1900s.
The irony is that some of these reformers did not associate their efforts with disease prevention, and not all of them even believed in the germ theory. Nightingale ridiculed the idea that disease could only come from contagion, claiming that she had seen cases of smallpox and other diseases caused by crowding in hospital wards. Pettenkofer emphasized environmental conditions in the development of cholera, including the soil, in contrast to the strict bacterial theory; after Robert Koch identified the Vibrio cholerae bacterium in water, Pettenkofer made a public demonstration of drinking a large dose of the germs, in order to prove that they alone could not cause the disease. (Somehow, through previous immunity or sheer luck, he suffered only minor symptoms.) But the efforts of these activists had good results, even if partly by accident.
Pest control was an important related effort. The Cuban epidemiologist Carlos Finlay first proposed in the 1880s that mosquitoes were the vector for the spread of yellow fever; in 1898, the same was shown to be true of malaria. With this key insight, we learned to control tropical diseases by driving mosquitoes away from human habitation and reducing their numbers, especially by draining or covering up the still waters that they need to reproduce. One of the first such campaigns was waged by the US Army during the construction of the Panama Canal. Later, with the invention of insecticides that were relatively harmless to humans, such as DDT, insect populations could be reduced even further, and malaria was effectively eliminated from the developed world by the mid-1900s. Typhus is spread by lice, and plague by fleas and rats, so pest control was important for a number of diseases.
Summing up: antiseptics and sterilization killed the germs that caused opportunistic infections, especially in hospitals. Sanitation, especially for water/sewage, and later for food handling, eliminated germs from what we consume. And pest control defeated most of the animal-borne diseases.
But environmental control could not defeat the contagious diseases that are passed directly from person to person. Humans are social animals, and it’s too hard to keep us apart. Solving the problem of highly contagious diseases, including STDs, required other weapons.
Before Pasteur, humanity had an immunization technique for exactly one disease: smallpox. As told in depth in a recent post, an ancient folk practice known as inoculation or variolation consisted of deliberately infecting the patient with smallpox virus via a scratch on the arm. When contracted this way, the disease was milder and far less deadly, and it still conferred lifetime immunity. Originating in China, India, and the Middle East many centuries ago, the practice was introduced to Europe and the Americas starting in 1721. In 1796, Edward Jenner discovered that immunity to smallpox could also be gained by the same procedure but substituting the cowpox virus, which was even milder, and never deadly. The Latin for cow is “vacca”, and the technique became known as vaccination.
Neither Jenner nor anyone else understood why these techniques worked, or what caused any disease in the first place, so for generations the techniques of inoculation and vaccination could not be applied to any other disease—until the work of Louis Pasteur. In studying diseases of livestock, Pasteur discovered—by chance, although as he himself famously said, “chance favors the prepared mind”—that it was possible to “attenuate” a germ, reducing its potency to cause disease while retaining its ability to confer immunity. He developed multiple methods for this, including passing a disease through multiple generations of hosts, causing it to quickly evolve to adapt to that host—which can simultaneously make it less able to replicate in a different species of host. A virus passed through animals, or even chicken embryos, can lose much of its ability to replicate in humans. Pasteur first developed vaccines for livestock (chicken cholera, swine erysipelas, anthrax), but the crowning achievement of his career was a vaccine for rabies, in 1885.
Vaccine development proceeded relatively slowly after Pasteur, for reasons I don’t fully understand (see below). A vaccine for typhoid fever was developed in 1896, and was used on soldiers in the Boer War and in WW1. Vaccines for diphtheria, tetanus, whooping cough and yellow fever came in the 1920s and ’30s. Thomas Francis developed the “flu shot” in the 1940s, and his student Jonas Salk developed the vaccine for polio in 1955 (these were the first “killed-virus” vaccines). Measles, mumps and rubella came in the 1960s; vaccines for less deadly diseases, such as chickenpox, HPV, and hepatitis, were not developed until recent decades.
New techniques were invented along the way. Not all vaccines use attenuated germs anymore: some use inactivated (or “killed”) versions; some consist of only a piece of the germ, such as a protein, that cannot itself replicate or cause disease; and some, for toxin-producing bacteria, use a version of the toxin itself rather than the germ.
Today the CDC recommends 17 vaccines for routine immunization, which is basically every major disease we have a safe, effective vaccine for, except for the ones that have been effectively eliminated through sanitation, pest control, or eradication campaigns. Other vaccines are recommended for travelers going to areas with contaminated water or dangerous insects.
Immunization is an excellent strategy—when it’s possible. But when a vaccine hasn’t been developed yet, or is only partially effective, or when an individual hasn’t been immunized, or can’t be for medical reasons, disease can still develop. And to cure a disease after it has developed requires entirely different tools.
Effective drugs to treat disease were few and far between until the late 1800s. Malaria could be treated with some effectiveness using quinine, derived from the bark of the cinchona tree, but there was little else.
Antitoxins were the first successful strategy for producing effective drugs. Some diseases, such as diphtheria and tetanus, cause their symptoms because of a toxin produced by the microbe. These diseases can be treated without killing the microbe itself, by neutralizing the toxin. Some animals produce natural antitoxins in response to these diseases, and so the blood serum of an infected animal is an effective treatment. Horses were used as factories to produce diphtheria antitoxin starting in the 1890s.
But not all diseases operate via toxins. The ultimate cure is to kill the microbe itself, or to prevent it from multiplying. Outside the body, bacteria can be killed relatively easily with heat, acid, or bleach, as discovered by Semmelweis, Lister, Pasteur, and others. But these treatments are also harmful to humans, especially when taken internally. How could we kill bacteria after they had invaded, without harming the host?
Antibiotics were the vision of Paul Ehrlich (the 19th-century German physician, not the 20th-century American Malthusian of the same name). He was an expert in the technique of staining tissues in order to study their structure under the microscope. In this process, a dye selectively binds to certain cells or tissues, leaving others unstained. If a chemical can selectively dye certain bacteria, Ehrlich thought, why can’t it selectively kill them? Indeed, the first antibiotic experiments were based on modified dye compounds, and Ehrlich’s lab discovered the first successful antibiotic drug in 1909, a cure for syphilis marketed as Salvarsan.
Salvarsan, however, was a very narrowly focused drug, effective only against one type of bacteria and one disease. The golden age of antibiotics began in the 1930s with the invention of the sulfa drugs, or sulfonamides, which attacked a whole class of bacteria, including many of the ones that caused pneumonia and wound infections. Penicillin, famously discovered by Alexander Fleming in 1928 and developed into a practical drug by Howard Florey’s lab in the 1940s, was even more effective against these diseases, and also against others including diphtheria and scarlet fever. By the 1950s, more classes of antibiotics had been developed, including “broad-spectrum” antibiotics that were effective against an even wider range of bacteria, curing tuberculosis, whooping cough, cholera, typhus, typhoid fever, and more.
Antivirals have proved much harder. Bacteria are cells separate from the body; to target them, drugs can interrupt their biochemistry and life processes, which are distinct from the host’s. But viruses replicate inside the body’s own cells, hijacking their replication machinery to do so. To interrupt a virus requires interfering with processes inside human cells, an even more delicate operation than killing foreign cells inside human blood vessels and tissues. The only significant success I know of with antivirals has been in the control of AIDS, and even here, we have not been able to completely cure the disease, but only to keep it in remission.
Of these techniques, which was most important in the conquest of disease?
Just eyeballing the list, vaccines and antibiotics seem to have been effective against the greatest number of diseases, both being very general techniques. Antiseptics, sanitation and pest control are more focused techniques, each of which has been applied to a few diseases within its purview.
But a full answer to this question should be quantitative, looking at disease incidence and mortality, and attempting to tease apart the effect of different techniques applied to each disease. I’m researching this now and would appreciate pointers to any relevant papers or data sets.
I am struck by the fact that the development of vaccines seems very slow, especially when compared to antibiotics.
Even if we ignore Jenner’s smallpox vaccine (1796) and start the clock with Pasteur’s rabies vaccine in 1885, we get: one a decade later (typhoid fever), few if any for another ~30 years, and then only 2–3 per decade from the 1920s to today. And we’re still missing highly effective vaccines for some major diseases, including malaria, syphilis, and AIDS.
In contrast, an enormous range of antibiotics were discovered very quickly: the sulfonamides, penicillin, streptomycin, tetracycline, chloramphenicol, and erythromycin (along with many variants) all came in the space of just two decades, from the mid-1930s to the early ’50s. And this included antibiotics effective against many kinds of bacteria, the result being that today we have treatment for every major bacterial disease (modulo resistant strains).
Why? Right now I can only speculate:
Immunity is by its nature specific: there’s no such thing as a “broad-spectrum” vaccine. This makes vaccines harder; each one is unique.
Antibiotics exist in nature, produced by fungi and other organisms; once we realized this, we just had to go out and look for them (mainly in the soil). Vaccines, in contrast, are a bespoke human creation: you’re looking for something that specifically does not replicate well in a human host, which is exactly what evolution selects against.
Vaccines often immunize against viruses, which are more difficult to work with. Bacteria can be viewed under an ordinary microscope, and can be grown in a petri dish; the techniques for this were developed in the 1870s by Robert Koch and his lab (including the guy named Petri who invented the dish). Viruses could only be cultured in live plants and animals until the 1900s, when methods were developed to grow them in tissue cultures (and later in eggs and even cell cultures), and they can only be seen with electron microscopes.
Perhaps the cycle time is inherently longer? Antibiotics kill bacteria in vitro, and work in a matter of hours or days. In contrast, to test a vaccine, you need to first vaccinate an animal model, then wait for the vaccine to take effect, then attempt to induce the disease itself, and wait to see whether symptoms develop.
Vaccines may have a certain scare factor associated with them that mandates more caution. If an antibiotic goes wrong, of course, it could have toxic side effects, which must be carefully tested for. But if a vaccine goes wrong, it can sometimes give you the very disease you’re trying to prevent. This happened in the 1930s with polio, when two different labs rushed into clinical trials too soon, and subsequent researchers were more cautious.
I’d appreciate comments from any experts who have a deeper understanding.
Attacking germs in the environment is ideal, and when possible we have addressed diseases this way. When this works, it has the benefit that the problem is solved for everyone in the environment. Because of the success of these techniques, the CDC does not recommend routine vaccination for diseases like cholera or yellow fever. The flip side is that you’re not protected when you go to a region that doesn’t have these controls, such as when visiting countries without water filtration—which is why certain vaccines are recommended for travelers. More importantly, purely environmental control doesn’t work for all diseases: contagious diseases that spread directly from person to person are simply too difficult to control.
Immunization thus provides an alternate means of preventing disease. It has the advantage of being effective for an individual regardless of what environment they’re in (which is why you get shots before traveling). It also creates “herd immunity” among a population, which protects against epidemics. But vaccines are difficult, and we don’t have them for everything. There’s a broad range of bacteria in the environment, and we can’t vaccinate against all of them. They must be used preventatively: if you start to develop symptoms of a disease, it’s too late for a vaccine to help. Some people can’t receive certain vaccines because of allergies, pregnancy, a weakened immune system, or other contraindications. And young children haven’t gotten their full schedule yet.
So we need drugs too, as that crucial last line of defense. If all else fails and you actually develop a disease, drugs are the only thing that will help you. Antibiotics provide defense in depth, curing disease when vaccination fails or is not possible, or when something slips through our environmental controls.
Together these techniques have been massively effective. The burden of disease has shifted to the old, with long-term risks such as cancer and heart disease now the leading causes of death. Given the historical rate of child mortality from infectious disease, everyone who survived to adulthood should give thanks to Jenner, Semmelweis, Chadwick, Nightingale, Pasteur, Lister, Koch, Ehrlich, Fleming, Florey, Francis, Salk, and all the others who created these breakthroughs.