Overview: Medicine 1900-1949
Overview
The twentieth century was a period of rapid scientific development and unprecedented progress in the biomedical sciences. During the first half of the twentieth century, advances in science—especially in microbiology, immunology, biochemistry, endocrinology, and nutrition—revolutionized medical theory and practice. Even though anesthesia and antiseptic techniques had transformed surgery in the nineteenth century, surgeons could not cope with blood loss, shock, or postsurgical infections until well into the twentieth century. Advances in biochemistry and physiology led to the development of more precise diagnostic tests and more effective therapies. The development of new instruments and laboratory techniques accelerated the growth of specialization within the medical profession and led to major changes in clinical medicine.
The identification and isolation of the microbial agents that caused many of the most important infectious diseases facilitated public health campaigns against epidemic diseases. Advances in preventive and therapeutic medicine correlated with remarkable changes in disease patterns and mortality rates. Striking increases in life expectancy also reflected broader social factors, such as improvements in nutrition, housing, sanitation, and health education. At the turn of the century, life expectancy in the United States was only about 50 years. By the 1950s, however, life expectancy was approaching 70 years. Similar changes occurred in other industrialized nations. In general, life expectancy increased for all groups, but significant differences were closely correlated with race and gender. In the United States, life expectancy was longer for women than men and longer for whites than nonwhites. However, American medical records, including reports of births, deaths, and specific diseases, are not very accurate for the period before the 1930s. In 1900 influenza and pneumonia, tuberculosis, and gastroenteritis were the top killers, but by 1950 heart disease, cancer, and cerebrovascular disorders were the major causes of death.
Research on the nature of infectious diseases and their means of transmission led to the discovery and classification of many pathogenic organisms. In addition to bacteria, scientists identified the rickettsias, which cause such diseases as typhus, and pathogenic protozoans, such as those that cause malaria. Some infectious diseases were attributed to mysterious microbial agents that were invisible under the microscope and small enough to pass through filters that trapped bacteria. These entities were operationally defined as invisible, filterable viruses. Although viruses clearly were able to reproduce and multiply in plant and animal hosts, they could not be cultured in the laboratory. Early studies of viruses included Peyton Rous's demonstration in 1910 that a virus could transmit a malignant tumor in chickens. Between 1915 and 1917, Frederick Twort and Félix-Hubert d'Hérelle independently discovered bacteriophages, which are viruses that attack bacteria and multiply within them. In 1935 Wendell M. Stanley showed that the tobacco mosaic virus could be purified and crystallized and still maintain its infectious qualities.
Advances in immunology and chemotherapy followed Paul Ehrlich's successful search for a "magic bullet," that is, a chemical that could destroy pathogenic microbes without seriously damaging the host. In 1910 Ehrlich and Sahachiro Hata discovered that arsphenamine, or Salvarsan, was effective in the treatment of syphilis. The next major advance in chemotherapeutics was Gerhard Domagk's discovery in 1932, published in 1935, that the red dye Prontosil was effective against streptococcal infections in mice and humans. After other researchers found that the active antibacterial component in the dye was sulfanilamide, many derivatives were synthesized and tested for efficacy and safety.
Alexander Fleming had discovered penicillin in 1928, before the introduction of the sulfanilamides, but Fleming was unable to purify and test the antibiotic. About 10 years later, Howard Florey, Ernst Chain, and others isolated penicillin and tested its potency and toxicity. Production of penicillin began during World War II and by 1944 the drug was being used to treat soldiers with infected wounds and infectious diseases. Unfortunately, penicillin was not effective against certain bacteria, including Mycobacterium tuberculosis, the bacillus that causes tuberculosis, which was still a major public health problem in the 1940s. In 1944 Selman A. Waksman and his colleagues isolated streptomycin and demonstrated that it was active against M. tuberculosis. The widespread use of antibiotics soon led to the emergence of drug-resistant strains of bacteria, which pose a significant threat to victims of disease and infection. Research on the mechanism of immunity led to diagnostic tests for several diseases, including syphilis (the Wassermann test) and tuberculosis (the tuberculin test).
Within a few years of the discovery of x rays by Wilhelm Conrad Röntgen in 1895, numerous medical applications for x rays were established. X rays were used to analyze fractures and locate stones in the urinary bladder and gallbladder. Doctors introduced radio-opaque substances into the body to study the kidneys, spinal cord, gallbladder, ventricles of the brain, the chambers of the heart, and the coronary arteries.
Military medicine, especially during major armed conflicts, provided direct demonstrations of the value of advances in sanitation, nutrition, and medical techniques. For example, the effectiveness of a vaccine against typhoid fever, developed in the 1890s, was established during World War I. The use of tetanus antitoxin for all wounded men during World War I virtually eliminated the threat of battlefield tetanus.
During the 1930s the use of vaccines, or toxoids, led to the control of tetanus and diphtheria. Widespread immunization of children against diphtheria virtually eliminated the disease in the United States and other industrialized nations. A live but weakened (attenuated) tuberculosis vaccine known as bacillus Calmette-Guérin (BCG), which was developed by Albert Calmette and Camille Guérin, has been used in Europe since the 1920s; since the infamous Lübeck disaster of 1930, however, its efficacy and safety have been questioned. In 1930, after 249 infants had been vaccinated with BCG vaccine in Lübeck, Germany, 73 died. After intensive investigations, scientists concluded that the vaccine was safe if properly prepared. The vaccine used in Lübeck had been contaminated by virulent bacteria.
Little progress was made in understanding viruses until the 1930s, when researchers learned to use tissue-culture techniques to grow viruses in the laboratory and the electron microscope to provide portraits of viruses. Scientists were then able to produce vaccines for yellow fever, influenza, and poliomyelitis. One of the most destructive epidemics in history was the influenza pandemic that killed more than 15 million people between 1918 and 1919. Because the influenza virus has the ability to transform itself from one epidemic year to another, outbreaks of influenza throughout the world must be carefully monitored so that appropriate vaccines can be developed.
The science of endocrinology evolved rapidly after 1905, the year that Ernest H. Starling introduced the term "hormone" for the internal secretions of the endocrine glands. During the first two decades of the twentieth century, various hormones, including epinephrine (adrenaline), were isolated and identified. The most notable event in this field was the discovery of insulin by Frederick Banting, Charles H. Best, and J. J. R. Macleod in 1921. In 1935 Edward C. Kendall isolated cortisone and in the 1940s Philip S. Hench and his colleagues demonstrated the beneficial effect of cortisone on rheumatoid arthritis. Cortisone and its derivatives proved to be potent anti-inflammatory agents that could be used in the treatment of rheumatoid arthritis, acute rheumatic fever, certain diseases of the skin and the kidneys, and some allergic conditions, including asthma.
Although interest in the relationship between foods and health is at least as ancient as Hippocratic medicine, the science of nutrition developed in the early twentieth century with the discovery of the "accessory factors," or vitamins. Frederick Gowland Hopkins proved unequivocally that a diet containing only proteins, carbohydrates, fats, and mineral salts did not allow animals to grow and thrive. The classic experiments published by Hopkins in 1912 stimulated the isolation and characterization of the vitamins essential to health. Improvements in diet and the use of vitamin supplements made it possible to prevent specific vitamin-deficiency diseases, such as rickets (vitamin D), scurvy (vitamin C), beriberi (vitamin B1, or thiamine), and pellagra (vitamin B3, or niacin). The work of George H. Whipple and George R. Minot in the 1920s showed that raw beef liver was useful in the treatment of pernicious anemia, a previously fatal disease. The active principle, vitamin B12, or cobalamin, was isolated from liver in the 1940s.
Significant progress towards controlling major tropical diseases was made possible by the introduction of new drugs and vaccines, but mosquito control and other environmental interventions were also essential. After World War I, several synthetic antimalarial drugs, including quinacrine (Atabrine), joined quinine in the prevention and treatment of malaria. These new drugs were quite effective in reducing the burden of malaria among Allied troops during World War II. The hope that malaria might be eradicated by a direct attack on its mosquito vector was raised by the successful use of the insecticide dichlorodiphenyltrichloroethane (DDT) during World War II. Although DDT was initially quite effective in the battle against malaria and yellow fever, mosquitoes became resistant to DDT and ecologists found evidence of environmental damage caused by the insecticide.
Twentieth-century surgeons gradually adopted antiseptic and aseptic techniques, including the ritual of scrubbing up, the sterilization of all instruments and dressings, and the use of gloves and masks during surgery. Anesthesia was accepted, but its administration was often left in the hands of untrained assistants. Although chloroform was clearly more dangerous than ether, it was often used because it was easier to administer. Eventually, surgeons replaced chloroform with a combination of nitrous oxide, ether, and oxygen. In operations that required the relaxation of the abdominal muscles, deep anesthesia was used—despite the danger this posed to the patient. Improvements in anesthesia included intravenous anesthesia using the barbiturate thiopental sodium (Pentothal) and the injection of curare to induce muscular paralysis. The introduction of inhaled anesthesia administered under pressure was essential to the development of thoracic (chest) surgery. Eventually, anesthesiologists became skilled specialists, making complex operations possible. Harvey Williams Cushing's pioneering operations for brain tumors, epilepsy, trigeminal neuralgia, and pituitary disorders established neurosurgery as a model for other surgical specialists.
During World War I surgeons who had learned to practice asepsis had to treat contaminated wounds while working under primitive conditions. They were forced to revert to antisepsis and ancient wound treatments, such as debridement (removal of morbid tissue and foreign matter). To fight putrefaction and gangrene, Alexis Carrel and Henry Dakin introduced a method of antiseptic irrigation (the Carrel-Dakin treatment); antitoxin and antiserum were used to prevent tetanus and gangrene from developing after wound treatment. Military surgeons also came to appreciate the importance of the patient's rehabilitation, a realization that led to important developments in orthopedic and plastic surgery.
Even after surgeons learned to cope with pain and infection, shock remained a major problem both during and after surgery. When physiologists discovered that shock was essentially due to a decrease in the effective volume of circulating blood, doctors attempted to fight shock with transfusions. Safe blood transfusions were made possible by Karl Landsteiner's 1901 discovery of the ABO blood groups. During World War II blood banks were organized to support the increased use of blood transfusions.
A few isolated attempts at heart surgery occurred in the first decades of the century, but most such interventions were failures. The medical profession generally considered heart disease a problem requiring medical management, rather than surgical intervention. During World War II Dwight Harken demonstrated that it was possible to save the lives of wounded soldiers by removing shrapnel from the chambers of the heart. After the war, several surgeons performed operations to correct congenital defects of the heart. It was not until the 1950s, however, that surgeons were able to stop the heart while still supplying the body with oxygen in order to perform more complex operations.
As modern medicine succeeded in controlling many of the infectious, epidemic diseases, the chronic, degenerative diseases—heart disease, stroke, and cancer—emerged as the major causes of death in the wealthy, industrialized nations.
LOIS N. MAGNER