The 1940s Lifestyles and Social Trends: Topics in the News
AMERICANS EMBRACE THE INTERNATIONAL STYLE IN ARCHITECTURE
FASHION GOES TO WAR
WAR TRANSFORMS THE FAMILY
WAR TRIGGERS RELIGIOUS REVIVAL
AMERICANS MOVE OUT OF TOWN
AFRICAN AMERICANS EXPECT MORE
AMERICANS EMBRACE THE INTERNATIONAL STYLE IN ARCHITECTURE
So-called modern architecture was popular in Europe during the 1930s, but it did not arrive in the United States until around the time of World War II (1939–45). The modern style favored clean, functional lines, and used construction materials such as steel and glass to create flat, machine-like surfaces. Because it was used around the world, this kind of architecture became known as the International Style.
One reason International Style architecture arrived in the United States in the late 1930s and early 1940s was immigration. Many modernist architects fled to the United States in the 1930s after Adolf Hitler (1889–1945) came to power in Germany. Walter Gropius (1883–1969) became dean of the architecture school at Harvard University, and hired his colleague from Germany, Marcel Breuer (1902–1981). Another colleague of Gropius, László Moholy-Nagy (1895–1946), taught at the Institute of Design in Chicago, while Eero Saarinen (1910–1961) taught at Cranbrook Academy of Art in Bloomfield Hills, Michigan.
The International Style adapted itself to the American setting and changed in the process. Alvar Aalto (1898–1976), who worked at the Massachusetts Institute of Technology, built a red-brick dormitory building known as the Baker House (1948), which curved along the Charles River. More radical than this was the Glass House (1949), designed by American-born Philip Johnson (1906–); the building's walls were made entirely of glass. Despite its rigid, squared-off lines, the Glass House blended in well with the surrounding landscape of New Canaan, Connecticut.
But the architect who did the most to define the style of the 1940s was Ludwig Mies van der Rohe (1886–1969). He arrived in the United States in 1937 and pioneered the technique of stretching a thin "skin" of glass over a steel or concrete structural "skeleton." His first stand-alone buildings in the United States were the Promontory Apartments (1948–49) and the apartment towers at 860–880 Lake Shore Drive in Chicago. Both of these buildings used ribbons of glass over a steel frame. This version of the International Style, known as Miesian architecture, became a symbol of corporate power. In Portland, Oregon, the Equitable Savings and Loan building (1948) used a skin of polished aluminum over a reinforced concrete frame, while Saarinen's General Motors Technical Center in Warren, Michigan (1948–65), drew on the design of Mies van der Rohe's buildings on the campus of the Illinois Institute of Technology (1939–41).
Besides the arrival of European architects, World War II produced other effects on American architecture. During the Great Depression many architects struggled to find work, and the situation did not improve in the early 1940s. In April 1942, the War Production Board (WPB) issued Order L-41, ending all construction other than that of buildings essential to the war effort. The Supply Priorities and Allocations Board (SPAB) put strict limits on the materials that could be used. The "skin-over-skeleton" techniques favored by International Style architects proved useful for low-cost housing projects and for military bases. Channel Heights in San Diego, California (1943), by Richard Neutra (1892–1970), and Aluminum City Terrace in New Kensington, Pennsylvania (1941), by Gropius and Breuer, are among the most distinguished examples of this spartan modern style.
Architects also were employed in wartime, designing factories, temporary housing for refugees in Europe and for military troops abroad, and detention camps. They designed housing delivered in ready-made parts that could be bolted together on site. One such building was based on a circular sheet-steel grain bin. Known popularly as "Igloos," these houses were easily transported and very strong. It was predicted that when the war ended, four hundred thousand new houses would be needed immediately, with demand rising to one million a year by the late 1940s. Some commentators expected "Igloos" to become a common sight in American cities. After difficult times in the 1930s and early 1940s, American architects in the second half of the decade found themselves in great demand.
FASHION GOES TO WAR
World War II had a dramatic effect on American fashion. Restrictions on materials such as silk, nylon, rubber, and leather changed the way clothes were made and the way they were worn. Government Order L-85, announced in 1943, stated that a maximum of one and three-fourths yards of fabric could be used in a dress. The only clothing styles unaffected by L-85 were wedding dresses, religious vestments, infant wear, and maternity wear. As supplies of fabric, buttons, and fasteners disappeared, American fashion was forced to change.
For men, shortages of fabrics meant that civilian suits became simpler. Gone was the double-breasted, three-piece suit of the 1930s. Instead, men in the 1940s wore the two-piece suit, with narrower lapels on the jacket and no cuffs or pleats in the pants. But in 1941, men's clothing manufacturers shifted their production almost exclusively to military uniforms. Before long it was difficult to find a suit of any kind in stores. Even after the war ended, clothes for men remained scarce because of the overwhelming demand caused by millions of soldiers returning to civilian society.
The effect of Order L-85 on women's clothes was also dramatic. Fabric shortages led to narrower waistlines, shorter skirts, and blouses without cuffs or pockets. Scarves and hoods were expensive and difficult to justify. For both women and men, blends such as rayon gabardine became the most common wool substitute. Women suffered most from restrictions on silk and nylon, the materials used to make stockings. Japan slashed its exports of silk in the late 1930s, and when war broke out supplies stopped altogether. Nylon, the alternative material to silk, first appeared in 1939. But within a few months of the first retail sales of nylon stockings, wholesale supplies of nylon were commandeered by the government.
Most of the garment industry's effort during wartime went into making military uniforms. Uniforms had to be durable, quick-drying, and adaptable. In the tropics, soldiers wore loose cotton shirts, while nylon netting helped protect them from insects. In Europe, American troops wore a uniform consisting of an olive-colored wool service shirt and trousers with a water-repellent M43 field jacket. Boots were high-laced, with a buckle ankle flap to protect the trouser bottoms in wet conditions. Foot soldiers could wear out a pair of boots in a month, causing a shortage of leather at home. There were tight restrictions on the amount of leather that could be used in civilian shoes.
The League of Broke Husbands
The New Look was highly popular after the drab fashions of the war years. But not all women thought it was a good thing. Twenty-four-year-old Bobbie Woodward of Dallas did not want to hide her legs as the new fashions required. And she did not see why her husband should have to buy her a complete new wardrobe. In August 1947, she founded the Little Below-the-Knee (LBK) club. LBK branches sprang up across the country. Its members picketed America's downtown stores, chanting "The Alamo fell, but our hemlines will not!" The high prices of New Look fashions annoyed men, too. One group formed the League of Broke Husbands to protest the way the fashion industry was manipulating them into spending their hard-earned money.
Women served in the military as nurses, clerical workers, pilots, and coast-guard recruits. Images of women in uniform filled the media. Six million women also went to work in war industries, such as ship and aircraft manufacturing. Most women in these jobs wore the same clothes as their male counterparts. But at Boeing, fashion designer Muriel King (1900–1977) designed special overalls for women. Boeing liked the overalls because they used a minimum of fabric and were therefore cheap. Workers liked the "slimming waistlines" and flattering cut. After work, women went back to wearing more feminine clothes, including belted dresses in printed fabrics.
During the war, designers made sticking to supply limits a matter of national pride. But after 1945, clothes became more luxurious. In spring 1947, French designer Christian Dior (1905–1957) introduced his "New Look," a style that has come to define the fashion of the late 1940s and the 1950s. Skirts were longer and fuller, with a "wasp" waistline. Soft, full collars and sleeves, sloping shoulder lines, and padded bras emphasized the feminine shape. While other designers such as Hollywood's Gilbert Adrian (1903–1959) created practical but feminine clothes, Dior's designs pushed women back into their traditional roles. For men, too, the postwar years brought a return to tradition. Emphasizing men's return to the workplace, the gray flannel suit was the dominant male fashion of the late 1940s. With its three-button jacket, small lapels, and flat-front pants, the gray flannel suit also marked the rise of corporate America in the postwar world.
WAR TRANSFORMS THE FAMILY
Around six million women joined the American labor force during the war. They took the place of men who had been drafted into the military. In the mid-1930s, 80 percent of Americans disapproved of women working outside the home. By 1942, only 42 percent saw it as a negative development. Sixty-nine percent of married working women wanted to keep their jobs after the war despite the fact that they were paid less than men. After the Japanese attack on Pearl Harbor thousands of couples applied for marriage licenses. In 1942, the government even encouraged couples to marry by providing a "family allotment" of extra pay for families of men in the military. Separation brought obvious problems for couples and their young families, spurring a desire for security and comfort in the postwar years. The government's position was that, in peacetime, a woman's place was in the home. Sociologist Willard Waller (1899–1945) argued that mothers who worked were creating a generation of juvenile delinquents. He suggested that the independent woman was "out of hand."
Children, as well as adults, were affected by the war. For one thing, there were many more of them in America than ever before. Couples often conceived a "goodbye baby" before the man went off to fight, so that by 1943 the U.S. birthrate was at a sixteen-year peak. It was the beginning of a "baby boom," which lasted until the early 1960s. In the long run, children usually adapted well to having absent fathers, but they often suffered anxiety and uncertainty in the short term. Sixteen million men were separated from their families by the war, and many children did not recognize their fathers when they returned from overseas. Labor laws were relaxed during the war to allow minors to work, and older children often left school early to take jobs in factories. A U.S. Census Bureau survey in 1944 found that one-fifth of all boys aged fourteen and fifteen were gainfully employed. A third of American girls between the ages of sixteen and eighteen also had jobs. Most of these children managed to combine factory work with schooling. The traditional American family had changed forever.
Youth Culture
Youth culture blossomed in the "Roaring Twenties," but went into decline during the hardships of the Great Depression. It reemerged after 1945, when parents and teenagers had more money to spend on recreation. Postwar teenagers had more freedom than their predecessors, but the amount of freedom they had varied with social class. Poor black teenagers in the South had more freedom in their social lives, especially with regard to sex, than black adolescents from wealthier homes. Teenagers who had dropped out of high school were able to meet in dance halls, bowling alleys, and skating rinks. They were more sexually active than high school students. Even so, dating was an important part of the high school student's social life and was governed by a complex set of rules. Adults tried to control the way teenagers behaved on dates, especially their sexual activity. One manual for teenagers recommended topics of conversation for a date. It included the suggestion "Talk about animals: 'My dog has fleas - what'll I do?'"
For men returning from the war, the idea of their wives or girlfriends going out to work was often difficult to handle. Many had married just before going off to fight, and they had spent their time in the military dreaming of a return to traditional family life. Some had suffered physical and psychological trauma on the battlefield and found civilian life difficult. In 1946, Veterans Administration hospitals treated almost twenty thousand veterans for psychiatric problems. Many more were treated in regular hospitals or suffered without help. Veterans often found that returning to work in large, impersonal corporations was in many ways similar to serving in the army. But for most men, there was little opportunity for self-fulfillment at work. In the late 1940s, the family seemed to be the only area where male authority remained in place. Yet even there, traditional gender roles had begun to change. Middle-class women, who were less likely to go out to work, were better able than others to affirm the importance of their husbands' role as "breadwinner."
WAR TRIGGERS RELIGIOUS REVIVAL
During the Great Depression church attendance in the United States fell steadily. In 1939, only about 43 percent of Americans were regular churchgoers. At the end of World War II in 1945, however, American churches saw a dramatic rise in their membership. By 1950, 55 percent of Americans belonged to a religious group. Between 1945 and 1949, three hundred thousand new members joined Baptist congregations, while the Catholic Church baptized one million new babies every year. The reasons for the religious revival were complex. The experience of war left many Americans searching for some meaning to life. But the growing affluence of American society also created new social pressures. Going to church was often seen as a sign that one was a trustworthy member of the community.
The war caused problems for churches and religious leaders. Many clergymen signed up for military service, often combining their duties as chaplains with counseling troops to help them through the ordeal of battle. At home, clergy put a huge effort into providing aid and comfort to families whose loved ones had gone off to war. Rabbi Stephen Wise (1874–1949) and Christian theologian Reinhold Niebuhr (1892–1971) advised government officials regarding decisions of life and death. At a time when God seemed to have left humanity to destroy itself, the faith and advice of such religious leaders helped sustain the nation's courage.
Not all clergy were in favor of the war, however. Pacifist A. J. Muste (1885–1967) believed there was no justification for the violence and destructiveness of war. Muste led a small group of pacifists who believed in nonviolent methods of solving the world's problems. Pacifists never gained large-scale support, but most religious groups did become more tolerant and liberal in the postwar years. Even Protestant revivalists, led from 1949 by Billy Graham (1918–), preached tolerance and humility. Although Graham's anticommunist views placed him in the politically conservative camp, his approach was more accepting of others' beliefs than it might have been a decade earlier.
There were three major religious groups in the United States in the 1940s. By the end of the decade the Catholic Church had twenty-five million communicants (active members), concentrated primarily in the Northeast. Catholics nationwide were far outnumbered by Protestants, however. Taken together, the Protestant churches were the dominant religious organizations in America during the 1940s. The largest single Protestant church, the Methodists, had eight million members and an annual budget of $200 million. The second-largest Protestant church was the Southern Baptist Convention, with six million members. The third major religious faith in the United States was Judaism. The five million American Jews divided themselves into three branches: Orthodox, Conservative, and Reform. Of the nations directly involved in fighting World War II, only the United States saw membership in its religious groups grow when the war ended.
Although religious observance grew after 1945, the nation's newfound affluence presented a challenge to churches and their leaders. For many people in the suburbs, going to church was a lifestyle choice rather than a sign of religious belief. Church attendance marked a person as a noncommunist and someone who could be trusted to conform. As Americans moved to the new suburbs, many found themselves living next door to people of different faiths. Religious leaders such as H. Richard Niebuhr, brother of theologian Reinhold Niebuhr, worried that American religions were merging as people borrowed from one another's rituals and beliefs. During the postwar years, there was an unprecedented growth in religious activity and in tolerance between faiths. Religious groups became less distinctive and, some thought, less relevant to modern American life.
AMERICANS MOVE OUT OF TOWN
As the economy expanded after 1945, more people could afford to buy their own homes. More importantly, the GI Bill gave veterans access to low-interest housing loans. But the available supply of good-quality housing was not enough to meet postwar demand. Developers such as Abraham Levitt (1880–1960) and his son, William Levitt (1907–1994), began building housing on the outskirts of cities. The Levitts' simple, low-cost houses were built on vast tracts of land that soon became known as Levittown. The second half of the 1940s brought a record-breaking boom in house building and the arrival of suburban living on a huge scale in America.
Most of the new homes built in the 1940s were located in suburbs. For middle-class white families, a suburban home with its picture window, small plot of land, and quiet location offered the good life promised by the American dream. By 1946, for the first time, most Americans lived in houses that they owned. Suburban houses were comfortable and self-contained. Designed for mothers to stay at home to look after their children, these houses were far away from dangerous urban streets. New household appliances, such as washing machines and vacuum cleaners, made it easier to deal with the housework. The backyard was ideal for hosting barbecues and other social activities. But for some Americans, suburban living was an impossible dream. Until 1948, the Federal Housing Administration (FHA) refused to finance houses for black families. Even when the FHA rule was found to be unconstitutional, blacks were still kept out of suburbia. It was almost impossible for African Americans to save enough money to buy a home. And even when they could, they were often unwelcome in white suburban neighborhoods.
The demand for suburban comforts and the scale of suburban growth were astonishing. In 1944, construction on 114,000 new houses was begun in the United States. By 1950, the number of yearly housing starts had reached a peak of 1,692,000. In some cases, people slept outside developers' offices to make sure they got the house they wanted. In early 1949, Abraham Levitt wrote to veterans who had applied for his $7,999 Levitt houses. The first 350 people in line on Monday, March 7, would get a home for a $90 down payment and $58 a month. Veterans began showing up outside Levitt's office on Friday night, March 4. Over the weekend, police had to control thousands of ex-GIs desperate for a new home. Levitt estimated that he could have sold at least 2,000 houses.
Automania
Developments in automobiles were largely halted by World War II. But the war led to new technologies and designs that were incorporated into the American automobile of the late 1940s. Improvements to the V-8 engine, developed in the 1930s, allowed American cars to have accessories like power brakes and power steering, electrically operated windows, and air conditioning. In 1946, Buick declared that it aimed "to make those Buicks the returning warriors have dreamed about…." Postwar cars were wider, lower, and more comfortable to ride in. Innovations such as convertibles with detachable steel roofs, better-organized instrument panels, and sleek shapes made new cars desirable to residents of the expanding suburbs. Cars also became more affordable. The cheapest, no-frills European imports, such as the British Ford, cost $1,570. American Ford offered Lincoln models priced from $2,500 up to $4,800 for the top-of-the-line Continental. Suburban life was impossible without an automobile, so as the suburbs grew, car sales also exploded. In 1945, there were 25.8 million cars registered in the United States. Between 1946 and 1950, 21.4 million new cars arrived on American roads.
Suburban life had a dramatic effect on American families and their spending patterns. In the years after the war, couples began to marry and have children at younger ages than before. They competed with their suburban neighbors to obtain the latest household appliances, the right car, and invitations to social events. This competition was seen as an investment in family life rather than as greed or conspicuous consumption. But there is no doubt that the pressure to conform to the consumer culture was very strong. The expansion of the suburbs, and the demand for consumer goods it triggered, was a major factor in the economic boom that followed World War II.
AFRICAN AMERICANS EXPECT MORE
In the 1940s, race relations in the United States went through a period of confusion and transition. On the one hand, after Pearl Harbor, Japanese Americans were subjected to racist taunts and ill treatment. Japanese American citizens on the West Coast were held in internment camps, and their property was sold off at below-market prices. Yet at the same time, African Americans experienced a gradual improvement in their situation. Executive Order 8802, signed by President Franklin D. Roosevelt on June 25, 1941, banned racial discrimination in the defense industries and in the federal government. It also set up the Fair Employment Practice Committee (FEPC) to investigate discrimination. Roosevelt's order was the first positive step by the federal government to ensure equal treatment for blacks and whites in the workplace.
Most African Americans in the defense industries worked in low-skilled jobs. But the number in semiskilled and skilled positions began to rise. Many moved north to work in factories and found a better standard of living. These blacks were also given a voice in politics because there were fewer restrictions on who could vote in the northern states. For example, in the South, voters were required to take a test to see if they could read the ballot. Because blacks had a lower standard of education, they were more likely to fail the test and thus be unable to vote. No such testing existed in the North. Poll taxes (fees that had to be paid before a person could vote) also did not exist in the North. Because Southern blacks were among the poorest residents, they were less likely to be able to pay the poll tax, and therefore often were excluded from voting in the South.
Despite Executive Order 8802, racial tension did not disappear overnight. In the military, blacks found it almost impossible to win promotion. Though opportunities improved during World War II, blacks and whites were kept apart (segregated) on many military bases. White and black soldiers often had conflicts. In cities, where many black and white workers were working together for the first time, violence erupted. In June 1943, black and white youths clashed in Detroit. As many as five thousand people rioted over three days. Twenty-five blacks and nine whites were killed, and more than seven hundred people were injured. The 1943 Detroit riot remains one of the worst race riots in American history.
The discovery of Nazi atrocities against minority groups in the early 1940s prompted a rethinking of American attitudes toward race. As the economy boomed after 1945, African Americans resented their exclusion from many postwar jobs and the lower pay they still received. Many whites agreed with them. Roosevelt's FEPC did not become a permanent commission after 1945. But on December 5, 1946, President Harry S Truman established the President's Committee on Civil Rights. On February 2, 1948, Truman delivered the first-ever presidential civil rights address to Congress. Southern Democrats blocked the passage of legislation to ban poll taxes, set up a federal antilynching law, and make the FEPC permanent. But Truman's committee paved the way for the civil rights struggles and victories of the 1950s and 1960s.