The 1950s Medicine and Health: Topics in the News
ASIAN FLU: A CRISIS AVERTED
THE BUSINESS OF HEALTH CARE: BILLIONS AND BUREAUCRACY
CANCER: WILL THERE EVER BE A CURE?
GERMS: WHAT THEY ARE, AND HOW TO DESTROY THEM
HEART DISEASE: AMAZING NEW TREATMENTS
"MIRACLE CURES" AND GREED: ANYTHING FOR PROFIT
MENTAL ILLNESS: A LACK OF COMPASSION
POLIO: DESTROYING A DEADLY DISEASE
RADIATION: A NEW WAY OF SEEING AND HEALING
SHAPING UP AMERICA'S YOUTH
TRANQUILIZERS: SIMPLE SOLUTIONS FOR COMPLEX PROBLEMS?
TUBERCULOSIS: THE DEMISE OF A DISEASE
ASIAN FLU: A CRISIS AVERTED
In 1918, more people died in a worldwide flu epidemic than had been killed during World War I (1914–18). For this reason, Americans feared a similar catastrophe in 1957 as the Asian flu began to spread. Flu (or influenza) is a contagious disease, caused by a virus, that produces inflammation of the respiratory tract, fever, muscular pain, and intestinal distress. This particular flu was a "Type-A" influenza, the most threatening of all flu viruses. Early in the year it began appearing in northern China (hence its name), and it threatened to become the century's deadliest epidemic.
Medical researchers worked quickly to develop a flu vaccine. The vaccine was first tested on fifty-five volunteers at the Maryland State Correctional Institute, and was found to be 70 percent effective. In June, the flu invaded the United States by way of travelers arriving in San Francisco from Asia. By then, many people in the most vulnerable occupations, such as hospital workers, already had been vaccinated. By September, thirty thousand flu cases were reported. However, by mid-month, over eight million doses of the vaccine had been produced. It then was estimated that by the end of the year eighty-five million doses would be available for use: a number that would allow for the vaccination of half the U.S. population. As a result of this thorough preparation on the part of the medical community, America was protected from the Asian flu.
THE BUSINESS OF HEALTH CARE: BILLIONS AND BUREAUCRACY
During the 1950s, great strides were made in medical science with regard to the prevention and cure of disease. Americans were astonished by these accomplishments, but they were also shocked when they learned their cost. Billions of dollars were spent on medical research and health care. Presidents Harry Truman (1884–1972) and Dwight Eisenhower (1890–1969), who, between them, served in the White House from 1945 through 1961, were outspoken supporters of federally sponsored medical research. Under their administrations, government health care and health research-oriented agencies flourished. In 1953, all such agencies were consolidated into the Department of Health, Education, and Welfare.
By 1956, more than $100 million was invested annually in medical research. Yet this remained a modest sum relative to the rise in the price of health care. In 1950, total health care costs in the United States were $8.4 billion. By the end of the 1950s, they had soared to $17.2 billion. Between 1950 and 1957, this rate of growth was 250 percent more than the rise in food costs, 160 percent more than the rise in housing costs, and over 175 percent more than the rise in the total cost of living. In 1959, the average family earned just over $6,600 annually but had medical expenses of $395, or about 6 percent of its income. The cost of hospital care had escalated as well; the price tag for a day of hospital care was almost $200. Meanwhile, the average doctor earned $16,000 annually, while a surgeon made $25,000.
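The figures above can be checked with a few lines of arithmetic. This is our own back-of-the-envelope sketch, not part of the source; the dollar amounts are taken from the text, and the derived percentages are computed from them:

```python
# Consistency check of the 1950s health care figures cited in the text.

family_income = 6600      # average annual family income, 1959 (dollars)
medical_expenses = 395    # average annual family medical expenses (dollars)

share = medical_expenses / family_income
print(f"Medical share of income: {share:.1%}")   # about 6 percent, as stated

cost_1950 = 8.4           # total U.S. health care costs, 1950 (in the units cited)
cost_1959 = 17.2          # total U.S. health care costs, end of decade

growth = (cost_1959 - cost_1950) / cost_1950
print(f"Growth over the decade: {growth:.0%}")   # costs roughly doubled
```

The computed share comes out to 6.0 percent and the decade's growth to about 105 percent, both consistent with the text's claims.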
President Eisenhower warned that, if costs kept escalating, socialized medicine (medical care for everyone, paid for by the government) would result. Those who passionately believed in the private health care system argued that taking the profit out of medicine would remove the incentive for physicians and researchers to continue their good work. Those who supported socialized medicine believed that medical costs were rising at an alarming rate. Poor families were being completely shut out of medical care, and average ones soon would be unable to afford it.
Health insurance promised to insulate individuals and families from the risk of financial ruin due to runaway medical costs, but not everyone could afford coverage. In 1950, approximately one-half of all Americans were covered by health insurance; this percentage rose to 71 percent by the end of the decade. The remaining 29 percent translated into fifty million uninsured Americans. Meanwhile, physicians began to resist the mounting paperwork involved in filing insurance claims. Some busy doctors were forced to hire employees who did nothing but file the various insurance forms.
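The two insurance figures above can be cross-checked against each other. This quick calculation is ours, not the source's; it simply asks what total population is implied if 29 percent uninsured equals fifty million people:

```python
# The text says the 29 percent of Americans without insurance
# amounted to fifty million people; derive the implied population.
uninsured_share = 0.29
uninsured_people = 50_000_000

implied_population = uninsured_people / uninsured_share
print(f"Implied U.S. population: {implied_population / 1e6:.0f} million")
```

This yields roughly 172 million, in line with the U.S. population of the late 1950s, so the two figures are mutually consistent.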
CANCER: WILL THERE EVER BE A CURE?
Of all diseases, cancer was arguably the most feared during the 1950s. Other maladies, such as heart disease, may have killed more Americans, but a diagnosis of cancer, most commonly defined as malignant (harmful, likely resulting in death) tumors that grow and spread anywhere in the body, was the equivalent of a death sentence.
At the beginning of the decade, some doctors considered cancer to be incurable. Many hospitals even felt it their duty to guard their cancer patients against intrusions by researchers. However, the American Cancer Society and the federal government began devoting enormous energy to fighting the disease, in the belief that researchers one day might uncover a cure. The American Cancer Society's annual budget rose from $4 million in 1947 to $110 million in 1959. In 1953, a cancer research hospital was opened at the National Institutes of Health complex in Bethesda, Maryland. By that time, health care specialists were more involved in cancer diagnosis. For example, women were advised to visit a gynecologist (a specialist in women's health and diseases), rather than a family doctor, for a Pap smear (a test to identify cervical cancer).
As the decade progressed, three types of cancer cures came to be considered: surgery, in which the malignancy was cut out; radiotherapy, in which the malignant cell growths were subjected to radiation (the process by which energy in the form of heat or light is emitted from molecules and atoms) in an attempt to kill them; and chemotherapy, not yet fully developed during the decade, in which a medicine was taken to attack the malignancy.
Cancer and Smoking
In the 1950s, the most debated cause of cancer was tobacco use. Early in the decade, researchers showed that mice painted with tobacco tars developed skin cancer. Scientists observed that humans who developed respiratory cancer often were smokers. In addition, they pointed out that smoking significantly contributed to the risk of cardiovascular (heart) disease.
America was a nation of smokers. Billions of cigarettes were produced and purchased each year. Cigarette manufacturers and their customers did not want to acknowledge any kind of link between smoking and cancer. In 1954, American cigarette companies formed the Tobacco Industry Research Council, a high-budget, high-profile public relations organization that defended the industry against criticism from health researchers.
As researchers labored to uncover a cure for the disease, alleged quacks and fakers took advantage of desperate cancer patients. One of the best-known and most controversial of the fakes was Harry M. Hoxsey (1901–1973), who back in 1924 had begun selling pills and liquid medicines that he claimed were miracle cancer cures. Their ingredients included licorice, red clover, and varieties of plant roots and barks. The Food and Drug Administration (FDA) and state and federal courts ruled that Hoxsey's pills were worthless. But to some he was a folk hero, as a succession of his patients swore that his cures worked. By the mid-1950s, Hoxsey was operating seventeen clinics across the country, and he was charging patients $460 per treatment. He kept on offering his pills until his death from pancreatic cancer in 1973.
Hoxsey's methods, however valid or invalid, serve to illustrate the ongoing debate between those who advocate mainstream medical treatment and those who favor the use of herbs and other natural substances to prevent and cure disease.
GERMS: WHAT THEY ARE, AND HOW TO DESTROY THEM
If the general health concerns of Americans during the 1950s could be reduced to a single word, it probably would be germs, or disease-causing microorganisms. However, the average person really did not know much about germs except that they sometimes were passed from one person to another, resulting in the spreading of disease.
The two most feared germs are bacteria (one-celled microorganisms) and viruses (submicroscopic infectious agents). During the decade, researchers made great strides in the understanding and control of these tiny enemies. In 1947, sixty viruses were thought to contribute to human disease; by 1959, seventy-six new ones had been identified. During the 1950s, the viruses that cause polio and measles, for example, were photographed for the first time, enabling scientists to learn more about their structure and even to grow them in their laboratories.
At this time, it was believed that the ultimate remedy for illness lay in shots and pills, particularly those that contained antibiotics (chemical substances, produced by microorganisms, that can stop the growth of or completely destroy bacteria). In 1950, penicillin, the mold-derived antibiotic discovered by Alexander Fleming (1881–1955) in 1928, was artificially produced for the first time. This breakthrough resulted in the widespread use of penicillin to combat bacterial infections. Other antibiotics were developed during the decade, including streptomycin, aureomycin, and neomycin. But as effective as antibiotics were against bacterial diseases, they were powerless against viruses. During the decade, scientists developed vaccines that protected against viruses causing such diseases as polio, measles, and flu, but a vaccine worked only if given before infection. Once an unvaccinated person had contracted a viral disease, vaccines could not minimize its effects, as antibiotics could with bacterial diseases.
HEART DISEASE: AMAZING NEW TREATMENTS
In the 1950s, half of all deaths in the United States were caused by heart disease: heart-related ailments that at the time were said to include over twenty maladies, including arteriosclerosis (hardening of the arteries around the heart) and hypertension (high blood pressure). A major type of heart disease is the inability of vessels to deliver blood to parts of the body. When the defective vessels are in the heart, the result is a heart attack. When they are in the neck or brain, the result is a stroke.
Heart disease became more treatable during the 1950s. Surgeons were performing what seemed like astounding heart-related surgery. In Philadelphia, an eleven-year-old girl was placed in a freezer to lower her body temperature to eighty-eight degrees Fahrenheit, allowing doctors to stop her blood flow for five minutes while they closed a hole in the wall of her heart. In Washington, D.C., surgeons implanted an artificial mechanical valve, similar to a plumbing device, in the heart of a thirty-year-old woman, restoring her to health. By the end of the 1950s, open-heart surgery was being performed regularly. A few years earlier, the success rate of such operations had been approximately 40 percent. The University of Minnesota reported that, at the close of the 1950s, just 2.5 percent of patients died as a result of the surgery.
The President Has a Heart Attack!
Americans were never more aware of the perils of heart disease than on September 24, 1955, when President Dwight Eisenhower (1890–1969) suffered a massive heart attack.
He was vacationing in Denver, Colorado, and had put in a full day of golf on the previous day. In the aftermath of the attack, America momentarily panicked. Stock prices plummeted at first, but quickly regained their losses. The president soon recovered, and he returned to the White House on November 11 of that year.
Of his heart attack, Eisenhower later observed, "I had thrust upon me the unpleasant fact that I was indeed a sick man."
Electric shock began to be used to control heartbeats and revive stopped hearts. In 1952, a patient who died during surgery was brought back to life by electric shocks administered to her heart. That same year, the electric heart pacemaker was developed. Initially, pacemakers were about twice the size of a pack of cigarettes. Surgeons placed them under the skin and connected them to the heart by electrical leads. The device delivered controlled electrical pulses, which stimulated a regular heartbeat. Batteries had to be changed periodically, but that required only minor surgery.
Some heart-related surgical procedures were still in their exploratory stages. In New York, a flap of skin from the chest wall of a dog was sewn to the surface of its heart so that the blood supply from the skin flap could revitalize a heart crippled by blocked coronary arteries. In Chicago, a healthy heart was transplanted from one dog to another. It kept beating for forty-eight minutes, suggesting to heart specialists that animal hearts might be able to keep humans alive during extensive surgery.
Also during the decade, inroads were made in the area of heart attack prevention. Researchers explored the effects of hypertension (high blood pressure) and levels of cholesterol (a substance found among the fats in the bloodstream and in all of the body's cells) on heart disease. Doctors routinely advised their patients to avoid too much salt, which was linked to hypertension. High-fat diets also were identified as unhealthy.
"MIRACLE CURES" AND GREED: ANYTHING FOR PROFIT
With the rising popularity of television during the 1950s came an increase in advertising that directly influenced the manner in which consumers chose the products they purchased. It was one thing for advertisers to claim that a certain kind of cereal was tastier, a particular brand of bathroom tissue was fluffier, or a specific toy was more fun than a competitor's plaything. It was quite another to declare that a medicine might cure an ailment when such claims could not be proven scientifically.
As long as products have been advertised, some companies that manufacture them have attempted to make outrageous claims about their effectiveness. In March 1950, the Federal Trade Commission (FTC) charged the makers of the popular medicines Inhiston, Anahist, Resistabs, and Kripton with false advertising for their overblown claims of effectiveness against the common cold: a malady whose cure remains elusive decades later. In 1956, Consumer Reports surveyed cold remedies and reported on their usefulness. Many (including laxatives, lemon drinks, and special diets) were of no medical value whatsoever. Others (such as nose drops, aspirin, and inhalers) offered only short-term relief. The best recommended course of action then was bed-rest, which prevented complications and kept the cold sufferer from infecting others.
In 1953, the American Medical Association (AMA) and the National Association of Radio and Television Broadcasters established a code that prohibited actors from playing doctors without being identified as actors, restricted overuse of words such as "harmless" and "safe," and barred advertising material that "describes or dramatizes distress." Nevertheless, drug manufacturers kept testing the limits of their advertising pitches.
Some of the dubious claims made by drug advertisers during the 1950s included:
"Infra-Rub speeds up the flow of fresh, rich blood, thus helps drive away pain-causing pressure."
"This doctor's discovery is called Sustamin 2-12. Doctors of three leading hospitals personally witnessed amazing results. They saw agonizing, crippling pain relieved day and night."
"Javitol contains 85 percent choice coffee blends, combined with a vegetable extract that lets you literally drink that extra weight right off your body. Can you imagine?"
Greed and exploitation relating to health care were not confined to the advertising profession. In May 1957, a seven-year-old Long Island, New York, boy fell into a 21-foot-deep, 10-inch-wide well being dug for irrigation by his father. Fire fighters, police officers, and construction workers toiled feverishly to save him. After nineteen hours, the attending physician admitted that there was little hope for the boy's recovery. But the doctor was wrong. Five hours later, a construction worker lifted the boy out of the well. Once he was out of danger, the doctor presented his family with a $1,500 bill. People were furious. The boy's truck driver father earned $62 per week; his mother, a telephone operator, earned $42.
The doctor eventually withdrew the bill. Yet the incident served as an example of why Americans increasingly viewed those in the medical profession as impersonal and uncaring, and only concerned with collecting their rapidly rising fees.
MENTAL ILLNESS: A LACK OF COMPASSION
On an average day in 1959, approximately eight hundred thousand Americans were in mental hospitals. Some never would leave. Indeed, such hospitals were little more than overcrowded warehouses where troubled individuals wasted away as they waited to die. Yet in mid-decade, there were only forty-seven hundred certified psychiatrists in the country, and only five hundred new ones were being trained each year.
In October and November 1956, The Saturday Evening Post, a popular national magazine, printed a six-part series on mental hospitals in America. The focus of the story was on one that appeared to be typical: Columbus State Hospital, formerly known as the Central Ohio Lunatic Asylum. Modern sensitivity may have demanded that the name be changed, but the sprawling facility was woefully understaffed. Of its 2,700 patients, just 385 were receiving therapy or treatment. Fourteen hundred were judged to be candidates for treatment if resources ever became available. The remaining nine hundred patients were labeled "custodial," meaning that they would be given minimal attention until they died.
Because it was inexpensive, the preferred form of treatment for many new patients was electroshock therapy, or EST. During EST, electric current is passed through the brain to induce convulsions that eventually have a calming effect on the patient. As the Columbus State Hospital's superintendent explained, "EST is our mainstay."
Wealthier mental patients were placed in private hospitals, where they received more individual psychotherapy. However, such hospital stays cost up to $1,600 per month, a hefty sum in the 1950s. At the time, only 2 percent of all mental hospital patients were in private institutions.
POLIO: DESTROYING A DEADLY DISEASE
In the early 1950s, poliomyelitis (commonly known as polio) was a dreaded disease: a viral infection that caused inflammation of the gray matter of the spinal cord, resulting in the paralysis (inability to move or feel sensation) of different groups of muscles. President Franklin D. Roosevelt (1882–1945) had been stricken with the disease as an adult. However, its primary victims were children; thus, the disease also was known as "infantile paralysis." During the early years of the decade, polio reached epidemic proportions. In 1953, the National Foundation for Infantile Paralysis (NFIP), which provided funding for polio research, education, and patient assistance, announced that more polio cases had been reported during the past five years than in the previous twenty. In 1950, 28,386 severe cases were reported. By 1952, the number had risen to 55,000.
Polio is caused by the entry into the body of any one of three types of viruses. The virus goes into the body through the mouth and briefly resides in the bloodstream before taking one of two routes. If the patient is lucky, the virus makes its way into the bowels and subsequently is expelled from the body. In a less fortunate victim, the virus travels into the central nervous system where it damages cells in the brain stem or spinal cord. In these cases, the result is severe paralysis and sometimes death.
In the early 1950s, widespread rumors circulated about how polio originated and spread. The NFIP put the word out that fruit, insects, animals, and bad genes did not cause polio. Because polio tended to strike during the summer, the NFIP suggested that parents send their children only to summer camps that offered proper medical supervision. Parents were advised not to allow children to mingle in crowds. While swimming in public pools did not itself cause polio, crowds at pools might increase the risk of transmission. As a result of this information, countless public swimming pools were closed. Boys and girls often were confined to their yards during the summer, and year-round in warmer regions. Youngsters were encouraged to play quietly, because sweating was thought to promote polio. They were discouraged from playing with anyone but their closest friends.
Jonas Salk (1914–1995) was propelled to international acclaim when he developed the first successful polio vaccine in the early 1950s. In the late 1940s, Salk became director of the University of Pittsburgh School of Medicine's Virus Research Laboratory. During the course of a three-year project sponsored by the NFIP, Salk demonstrated the existence of three types of polioviruses. He determined that in order to be effective, a vaccine must work against all three. He then began working to develop such a vaccine, using dead viruses suspended in mineral oil and formaldehyde. Tests of the vaccine Salk created took place in 1953 and 1954. During 1954, 1.8 million schoolchildren took part in the trials; they came to be known as "polio pioneers." Over 440,000 "pioneers" were given the real vaccine by injection, while 210,000 were given placebos (dummy shots); the remaining children were observed as a control group. The tests were successful: polio struck those who received the placebo at 3.5 times the rate of those who had been given the vaccine. As a result, the mass immunization of American children began in 1955.
By the end of 1958, 200 million shots of what had become popularly known as the Salk vaccine had been given to Americans, starting with first graders. However, its benefits were limited. It immunized against polio for only about thirty months, at which point a second shot was required. Before the decade ended, Albert Sabin (1906–1993) developed a more effective vaccine: a "live-virus" one, made up of viruses that were too weak to cause the disease but strong enough to stimulate the human body to react against it for a longer time. Sabin first tested his vaccine on thirty prisoner volunteers in 1955. However, the thinking at the time was that the only good virus was a dead virus. So Sabin accepted the invitation of scientists from the Soviet Union to further test his vaccine in Russia. He successfully vaccinated millions of Soviet schoolchildren and, by the early 1960s, his vaccine had completely replaced the Salk vaccine. It provided nearly lifetime immunity from the disease. By the 1980s, fewer than ten cases of polio were reported in the United States each year.
Not surprisingly, children preferred the Sabin vaccine over the Salk version for one reason: It was taken orally, rather than by injection!
RADIATION: A NEW WAY OF SEEING AND HEALING
One of the side benefits of nuclear research done during the 1950s in the name of national defense was that scientists came to understand the manner in which radiation works. Radiation is the process by which energy in the form of heat or light is emitted from molecules and atoms as they undergo internal transformation. Those researching the process and its effect discovered that controlled, low levels of radiation could be used for medical purposes.
Radiation was employed to assist in diagnosing cancer. Scientists observed that cancerous tumors absorbed radioactive material (substances that give off radiation) several times more readily than normal tissue. A patient suspected of having cancer was asked to swallow capsules filled with low-level radioactive material. After allowing time for digestion, a Geiger counter (an instrument that measures levels of radiation) was used on the outside of the abdomen. If the Geiger counter recorded unusually high levels of radioactivity, it was assumed that the reaction came from a cancerous tumor.
Radioactivity also was utilized in the healing process. In 1953, scientists unveiled a cobalt-ray machine on national television. This apparatus could send massive doses of radiation to a focused spot on a patient's lung, killing the targeted cancerous tissue. But because radiation treatments also destroy some healthy body cells, many people were concerned that treatments might be as deadly as they were beneficial.
SHAPING UP AMERICA'S YOUTH
According to the results of a 1957 study testing the "minimum muscular fitness" of American children, over half of the nation's young people were not physically fit. When the same test was administered to a group of European students, more than 90 percent passed.
President Dwight Eisenhower (1890–1969), a former U.S. Army general, considered the nation's physical fitness a key to national security. To address what he labeled the "fitness gap," he created the President's Council on Youth Fitness. The council advised schools and communities to provide additional opportunities for organized sports and outdoor physical activity.
The irony was that the decline in American fitness was a direct result of the country's progress. Because of labor-saving devices found in many American homes, children had fewer and less-strenuous chores; for example, dishwashers cleaned the dishes, while washing machines and dryers handled the laundry. Increased automobile traffic often made walking or bicycling to school too dangerous, and children were likely to ride school buses in many fast-growing communities. The growth of cities and suburbs led to a decline in space for children to play. Finally, children increasingly preferred to remain inside and watch television rather than go outside and hit baseballs or jump rope.
TRANQUILIZERS: SIMPLE SOLUTIONS FOR COMPLEX PROBLEMS?
During the 1950s, America became a nation of pill-poppers. As scientists discovered new drugs to combat illness, Americans came to believe that pill-taking was a simple and effective means of feeling better. Among the era's most popular pills were tranquilizers, whose ingredients slowed the action of the central nervous system and thus reduced nervous tension and anxiety. In 1956 alone, doctors wrote thirty-five million prescriptions for tranquilizers, a rate of about one every second. Anxious patients struggling to cope with the stresses of modern life paid $150 million to get their pills.
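The "one every second" claim above is easy to verify. The quick calculation below is ours, not the source's; it divides the year's prescriptions by the number of seconds in a year:

```python
# Check the claimed rate of tranquilizer prescriptions in 1956.
SECONDS_PER_YEAR = 365 * 24 * 60 * 60    # 31,536,000 seconds
prescriptions = 35_000_000               # prescriptions written in 1956

rate = prescriptions / SECONDS_PER_YEAR
print(f"{rate:.2f} prescriptions per second")
```

The result is about 1.11 prescriptions per second, so "one every second" slightly understates the pace.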
One of the decade's most popular tranquilizers was called Miltown. It was developed by the chemists Ludwig and Piech; a competing company, Wyeth Laboratories, marketed the tranquilizer under the name Equanil. The pill was touted as a nonhabit-forming cure for anxiety and nervous tension. It achieved its effect approximately forty-five minutes after intake. The result was a satisfying sensation caused by muscle relaxation.
When Miltown and Equanil first were marketed in 1955, they were instant successes. Both were prescribed for illnesses ranging from alcoholism to insomnia, in addition to simple nervous tension. However, some in the medical profession were concerned that these tranquilizers were being prescribed too casually and haphazardly. In 1957, Time magazine reported that a woman in Beverly Hills, California, asked her doctor to prescribe tranquilizers for her daughter "who needed them to get through the trying first week of her honeymoon." A woman in Boston asked her pharmacist for a bottle of "happiness pills." The following year, the president of the American Medical Association (AMA) warned that "modern man cannot solve his problems of daily living with a pill."
TUBERCULOSIS: THE DEMISE OF A DISEASE
By the 1950s, tuberculosis (TB) had long been a frightening and deadly disease. TB is a bacterial disease that affects the lungs and spreads easily in crowded, unsanitary living conditions. Because tuberculosis is highly contagious, victims were isolated in special hospitals called sanatoriums. The death rate from the disease in 1950 was only 11 percent of what it had been a half-century earlier. Still, 33,633 people died of tuberculosis that year. The discovery of antibiotics to combat the disease helped to further reduce its death rate. In addition, a blood test developed early in the decade could accurately detect the disease in its early stages.
By mid-decade, the number of deaths from tuberculosis had been cut in half, and as the years passed, they kept steadily decreasing. All of this signified the beginning of the end of TB as a formidable public health problem.