The Second Phase of the Industrial Revolution: 1850–1940
The practices of using engines as substitutes for animal and human muscle power and of using machines to produce goods took on a different character after about 1850. Sometimes called the second Industrial Revolution (or the second phase of the Industrial Revolution), this new phase differed from the original in several ways, and marked an important shift in the progress of the revolution.
With the rapid spread of the Industrial Revolution from Great Britain to the United States and Europe came a wave of inventions, some of which were new, many of which simply improved upon existing machines. Advances in science, particularly in chemistry, led to widespread changes, especially in agriculture and medicine. Petroleum became an important source of energy, leading to a new class of mobile machines (notably automobiles and trucks). Electricity was developed into a new means of delivering energy, leading to the introduction of small motors as well as superior lighting for both factories and houses. A new process of stringing together several inventions to create complex systems revolutionized manufacturing, transportation, and communications, and helped to create new business enterprises that were much larger than anything that had come before.
Taken together, these changes accelerated the impact of the Industrial Revolution on society throughout Europe and North America. Whereas everyday life for most people had changed relatively little from 1700 to 1800, it changed profoundly from 1800 to 1900 and beyond.
The United States: The stage is set for rapid growth
During the nineteenth century, the United States grew rapidly, from 5.3 million people in 1800 to 76.2 million in 1900. Over roughly the same period, the land area of the United States increased from 891,000 square miles (2,307,690 square kilometers) in 1791 to 3,021,295 square miles (7,825,154 square kilometers) in 1900 (excluding Alaska and Hawaii). The increase in size was primarily a result of the Louisiana Purchase (the acquisition by President Thomas Jefferson in 1803 of French-held lands in North America), the Texas war for independence from Mexico (1835–36), and the war between the United States and Mexico (1846–48), which together expanded the area of the United States from a few states on the Atlantic coast to vast tracts of land stretching all the way to the Pacific Ocean.
The dual increases in population and land area provided the United States with a huge internal market for manufactured goods, as well as a large domestic supply of raw materials. The expansion encouraged immigration as well as westward migration. The task of feeding, clothing, and supplying the rapidly growing population with goods enabled the Industrial Revolution in the United States to reach heights unmatched by any other country.
In the first half of the nineteenth century, the growth of industry was concentrated in the Northeast. Agriculture continued to dominate in the South, although the southern states were affected by industrialization as northern textile mills demanded more and more supplies of raw materials. Industrialization also spawned divisions between North and South. Southern cotton planters were convinced that slave labor was required to keep their agricultural economy growing, even as slavery became a political and moral issue in the North. Manufacturers in the North favored tariffs, or taxes, to raise the price of imports, mostly from Britain, while Southern planters resented paying higher prices so that businesses in the North could benefit. Southern planters also did not like hurting some of their biggest customers, who were located in Britain. Settlers in the West were mainly interested in raising grain (especially wheat) and cattle, and they favored federal help in paying for the expansion of railroads that brought their crops to market.
British inventions leave home
British authorities had been quick to recognize the value of the early inventions that began to transform the British economy during the first stage of the Industrial Revolution. To try to maintain their advantage over other countries, the British government passed several laws that prohibited the export of both textile machines and the design plans for these machines. But the laws proved impossible to enforce, and even before 1800 British inventions began showing up in Europe and the United States. An Englishman named Samuel Slater (1768–1835) helped speed the industrialization process along in the United States.
Samuel Slater: The man with a memory for machinery
Samuel Slater was born in Belper, England, in 1768. At age fourteen he went to work for Jedediah Strutt, who owned a cotton mill in Belper with the highly successful businessman and inventor Richard Arkwright (1732–1792). Arkwright had developed the water frame, a machine for making thread from cotton. The water frame was powered by a waterwheel, which turned as water in a flowing river or stream pushed against slats in the wheel, putting the attached machinery in motion (see Chapter 3). Well versed in the operation and design of Arkwright's machinery, Slater eventually became the supervisor of machinery and mill construction at the mill.
To maintain its control over textile machinery, Britain made it illegal for textile workers to leave the country. Meanwhile, however, investors in the United States were offering rewards for anyone who could deliver plans for a textile mill. In 1789 Slater decided that the British textile industry had reached its limits and that he had a brighter future in the United States. He slipped into a disguise and left England on a two-month voyage, landing first in New York and later moving to Rhode Island. There a businessman named Moses Brown was one of those offering a reward for help in building a textile mill.
Improving on Arkwright's design, which Slater had memorized, Slater and Brown designed machinery to card (clean and untangle the fibers) and spin cotton into thread. The two men together established a textile mill, and in 1793 Slater built his own mill on the Blackstone River in Pawtucket, Rhode Island, a step widely regarded as the start of the Industrial Revolution in the United States. Slater not only knew how to design the machinery, he was also expert at running a mill. He eventually owned several cotton mills, and he went so far as to establish a new town called Slatersville, Rhode Island, a few miles northeast of the state capital, Providence.
There was no shortage of streams and rivers in New England to provide water power to run the frames, and within a decade of Slater's establishing his first mill, there were eighty mills operating in New England, firmly establishing the American Industrial Revolution. The fact that the mills were established in that region would eventually have enormous political implications for the United States: New England mills were put in competition with Britain for both the cotton grown on plantations in the southern states and for the sales of finished fabric.
In most ways, the Industrial Revolution in America proceeded along the same basic path as in England. One difference lay in resources: the United States had more of its own raw materials, such as cotton, and a large territory that it was populating with Europeans by driving out Native Americans, whereas Britain had access to cheap raw materials and room to grow through its large overseas empire.
The spread of the idea of the Industrial Revolution was not limited to the United States, however. Businesspeople in Europe were also eager to take advantage of the wealth brought by the Industrial Revolution by copying machines they observed in Britain. These business owners in turn developed new machines, or improved upon the concepts embodied in the English originals.
Striking oil! A new era begins
Demand for coal increased throughout the Industrial Revolution. It was the primary source of energy for the growing number of steam engines and, later, for electricity-generating plants. One factor in Britain's early success in industrialization was its plentiful supply of coal. The United States also had vast supplies of coal under the Appalachian Mountains of Pennsylvania and western Virginia (present-day West Virginia) and elsewhere. Other countries, such as France and Germany, had abundant supplies as well.
In 1859 Edwin L. Drake (1819–1880), a retired railroad conductor, made a momentous discovery in the town of Titusville, Pennsylvania: an underground deposit of liquid petroleum (oil). It was not the first time oil was noticed in North America. Native Americans had long used oil that seeped to the Earth's surface for medicine, and even for fuel. Early European explorers also found oil seeping from surface rocks or gathered in pools. The early frontiersman Kit Carson (1809–1868) sold oil to lubricate wagon wheels.
Discontinuous versus Continuous Inventions
In the first stage of the Industrial Revolution, the steam engine and new textile machines marked a drastic departure from the way work had been done for centuries. Steam-driven engines gave people a form of strength and power never before experienced. Textile machines enabled a single worker to produce vastly more thread and cloth than ever before.
The inventions that make possible dramatic changes are sometimes called "discontinuous." Instead of resulting in slight change or improvement along a steady path, discontinuous inventions seem to come from nowhere and profoundly change the world. A "continuous" invention is one that either improves upon an older invention, or uses an older invention in a new way. Whereas discontinuous inventions are rare, sets of small improvements to existing inventions for use in other applications are much more common.
The 1800s saw many continuous inventions. As the steam engine and spinning and weaving machines came into widespread use (see Chapters 2 and 3), England, Europe, and the United States entered into a new period in which small refinements were made to older inventions. In general, the second stage of the Industrial Revolution was marked by a large number of continuous inventions, rather than by discontinuous ones.
Drake's discovery marked the beginning of the American oil industry, and a sharp turning point in the Industrial Revolution. The fact that the United States, unlike other leading industrialized countries such as Britain, France, Belgium, and Germany, has some of the world's largest underground petroleum reservoirs is one reason for America's rapid industrial growth in the second half of the nineteenth century and into the twentieth.
What oil brought: The internal combustion engine
The development of the internal combustion engine coincided approximately with the discovery of the first oil wells in North America. After wells established a steady supply of oil, there came new techniques to refine (process) the crude oil into a variety of useful substances, including kerosene (burned in lamps), gasoline (as fuel for internal combustion engines), and, later, plastic. In the United States, the internal combustion engine became the major successor to the steam engine perfected by James Watt (1736–1819; see Chapter 2). It eventually changed the character of industrialization.
Internal combustion engines differ in a fundamental way from their predecessor, the steam engine. In a steam engine, coal is burned in a separate chamber outside the engine (external combustion) to convert its inherent energy into steam. Steam is inserted under the piston, where the steam's expansion pushes the piston (a solid, tubular piece of metal that slides up and down, or back and forth, inside a hollow cylinder) to the other end. As the steam cools, it contracts and creates a vacuum, which pulls the piston back down. In the internal combustion engine, the process is similar, but a mixture of gasoline and oxygen is squirted under the piston and ignited by an electric spark. The combustion (explosion) of the gasoline and oxygen pushes against the piston. In both engines, a metal rod attached to the piston converts this motion, via a series of gears, into other types of motion, such as the circular motion of a vehicle's wheels. This process is repeated over and over, hundreds of times a minute; most internal combustion engines use several sets of cylinders and pistons (four, six, or eight in modern automobile engines, for example) to generate more power.
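The arithmetic behind that last point is easy to sketch. The Python lines below use invented, illustrative numbers (the firings per minute and energy per firing are assumptions, not measurements of any historical engine); they show only that average power scales with both the firing rate and the number of cylinders.

```python
# Illustrative sketch: average engine power as firing rate x energy per
# firing x cylinder count. All numbers are assumed, not historical data.

def average_power_watts(firings_per_minute: float,
                        energy_per_firing_joules: float,
                        cylinders: int) -> float:
    """Average power = (firing events per second) * (energy per event)."""
    return (firings_per_minute / 60.0) * energy_per_firing_joules * cylinders

one_cylinder = average_power_watts(300, 200, cylinders=1)
four_cylinders = average_power_watts(300, 200, cylinders=4)
print(f"one cylinder: {one_cylinder:.0f} W; four cylinders: {four_cylinders:.0f} W")
# -> one cylinder: 1000 W; four cylinders: 4000 W
```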
Internal combustion engines can be made much smaller and lighter than steam engines (as small as a lawn mower or chain saw, for example), partly because they do not require a coal fire heating water in order to create steam. They are also easier and quicker to start and stop (no need to start coal burning or to cool it down), and better suited for smaller-scale uses, such as automobiles and trucks. And because internal combustion engines are smaller and lighter, they can be used on smaller vehicles that can run virtually anywhere over roads, unlike very heavy steam-driven locomotives that require railroad tracks.
The idea of the internal combustion engine was not new in the 1800s. The Dutch scientist Christiaan Huygens (1629–1695) experimented with such an engine as early as 1680. But the real breakthrough came in 1859, when a French engineer, Jean-Joseph-Étienne Lenoir (1822–1900), built an internal combustion engine that could operate continuously. Lenoir's engine used coal gas (made by processing coal) as fuel. Three years later, another Frenchman, Alphonse-Eugène Beau de Rochas (1815–1893), patented another version of an internal combustion engine. In 1876 a German engineer, Nikolaus Otto (1832–1891), developed another successful engine, whose four-stroke design became known as the "Otto cycle." Some historians credit Otto as being the true inventor of the internal combustion engine.
The greatest jump forward in internal combustion technology occurred in 1885, when a German engineer, Gottlieb Daimler (1834–1900), developed a light, high-speed engine that resembles the motors in twenty-first-century cars. Daimler's engine was small compared to a steam engine, and it operated continuously for as long as petroleum was available to fuel it. Daimler first connected his engine to a two-wheeled vehicle. In 1886 he used his motor to drive a four-wheeled vehicle, which is generally recognized as the prototype of the modern automobile. (Daimler's name remains current in the form of DaimlerChrysler, the company that manufactures Mercedes-Benz and Chrysler automobiles.)
One advantage of Daimler's design was that it provided a better balance of power to weight; that is, a lighter engine could produce more energy. Excessive weight was always a serious drawback of the steam engine. Steam could produce tremendous energy (to propel a train, for example), but steam engines tended to be large and heavy in order to withstand the pressures of the steam, as well as to carry large quantities of water to convert to steam and coal to heat the water.
Manufacturing automobiles, in turn, became the largest industry of the twentieth century. The automobile, a direct result of the Industrial Revolution, is an invention that brought about some of the most profound social changes associated with living in the industrialized world of North America, Europe, and Japan. For example, cars enabled individuals to live relatively far from their place of employment, facilitating the growth of suburbs and the concentration of poorer workers in city centers.
The electric era
Just as the discovery of oil enabled rapid changes brought about by motorized transportation, another form of energy that was developed in the second half of the nineteenth century resulted in equally dramatic social and economic changes: electricity.
Electricity is a form of energy caused by the presence of electrical charges in matter. Electrical energy can be generated from mechanical energy by a machine called a generator. In a generating plant, coal, oil, or natural gas is burned, or nuclear energy is used, to boil water to make steam. The steam is then used to drive a turbine, a large shaft mounted with a series of blades that fan out from the center. As steam strikes the blades, it causes the central shaft to rotate. If the central shaft is attached to a very large magnet (a piece of metal that attracts iron or steel), it causes the magnet to rotate around a central armature (coils of wire wrapped around a metal core), generating electricity. The electricity flowing through wires delivers energy in a form that can be used to light lamps and to drive motors, both large and small.
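The physical principle at work here, Faraday's law of induction, can be sketched in a few lines. The coil size, field strength, and rotation speed below are assumed, illustrative values; the formula for a flat coil rotating in a uniform magnetic field is standard physics.

```python
import math

# Sketch of the generator principle: a coil rotating in a magnetic field
# produces a voltage that rises and falls with the rotation angle.
# All numbers below are assumed, illustrative values.

def emf_volts(turns: int, field_tesla: float, area_m2: float,
              omega_rad_per_s: float, t_seconds: float) -> float:
    """EMF = N * B * A * omega * sin(omega * t) for a rotating flat coil."""
    return (turns * field_tesla * area_m2 * omega_rad_per_s
            * math.sin(omega_rad_per_s * t_seconds))

omega = 2 * math.pi * 60                # coil turning 60 times per second
for t in (0.0, 1 / 240, 1 / 120):       # start, quarter turn, half turn
    print(f"t = {t:.4f} s: EMF = {emf_volts(100, 0.5, 0.01, omega, t):6.1f} V")
# Peak output (at the quarter turn) is N*B*A*omega, about 188.5 V here.
```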
Electricity is used today to run appliances in homes and machines in factories, but when it was first developed, it powered a communications revolution. The telegraph, a device invented in the mid-1800s that uses electricity to send instantaneous messages over long distances, minimized the constraints of time and distance, much as locomotives and steamboats had done for travel and shipping. The telegraph, like the telephone that came a generation later, enabled businesses to take orders, buy raw materials, and otherwise treat a whole country—and eventually the entire world—as if it were a local marketplace.
With the telegraph came improved transportation systems (the telegraph was crucial in helping railroad operators coordinate the movement of trains over railroad tracks) and the building of large central factories (where raw materials from far away could be transported by rail and manufactured goods could be delivered to distant customers). Through the telegraph, businesspeople could learn of new opportunities in new markets far away (or of disasters that might threaten their business) and respond quickly. Thus, technology helped stretch the limits of the human imagination, which was perhaps the greatest change of all to come out of the Industrial Revolution.
"What hath God wrought?" The telegraph
In the twenty-first century, instantaneous communications are taken for granted. Telephones are everywhere, and the Internet makes it possible to send messages, sounds, and pictures almost instantly nearly anywhere in the world. It is easy to overlook the novelty, and importance, of rapid communication when it first became available in the mid-1800s.
The person widely credited with the successful introduction of the telegraph was an American, Samuel F. B. Morse (1791–1872). In his early life, Morse was a moderately successful portrait painter who had left his home in Massachusetts to pursue an art career in Britain. In 1832, sailing back to the United States from England, Morse heard about discoveries involving electromagnets made the previous year by the English scientist Michael Faraday (1791–1867). Electromagnets are temporary magnets that consist of a core of metal, such as iron, surrounded by a coil of wire. When an electric current is applied to the coil, it turns the metal into a magnet (which attracts, or sometimes repels, other metals), which can then perform a task, such as lifting a weight or causing another piece of iron to move.
Over the next three years, Morse worked on the concept of varying the flow of electricity over wires in order to send a signal. As is so often the case, it is impossible to credit one person alone with an invention, including the telegraph. Experiments dating to 1753 had tried to use electricity and magnetism as a means of communication. Some early devices used multiple wires to transmit messages (one used twenty-six wires, one for every letter of the alphabet). Early pioneers included William Sturgeon (1783–1850) of England, who in 1825 turned a small piece of iron, weighing just 7 ounces (198.4 grams), into an electromagnet by sending electric current through a wire from a battery. The little electromagnet was able to lift a piece of iron that weighed 9 pounds (4 kilograms). In 1830 the American physicist Joseph Henry (1797–1878) went a step further and sent a current through a wire a mile long, "magnetizing" a piece of metal that struck a bell (that is, sent a signal).
Nevertheless, it was Morse who put together a complete system of sending and receiving signals over a distance. His telegraph consisted of an energy source (a battery), an electromagnet, and an electric switch known as a key. As the key came in contact with the metal plate beneath it, an electric circuit (path) was completed. Electricity flowed out of the telegraph, into external electrical wires, and to waiting receivers. As the current came into the receiver, it caused the magnet to pull down a device that made a clicking sound (or punched a hole in a strip of paper). Morse also worked out a successful code (known as Morse code), comprising combinations of long and short bursts of electricity created by tapping on the key. Trained operators translated the combinations of longer and shorter bursts into letters, thereby sending and receiving messages. In 1843, Morse persuaded the U.S. Congress to provide funding to establish a demonstration of his system.
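The encoding idea at the heart of Morse's system is simple enough to sketch in code. The table below uses the modern international Morse alphabet, which differs in some details from Morse's original American code, and covers only the letters needed for the example.

```python
# A minimal sketch of Morse encoding. Partial table: international Morse
# for just the letters used in the famous 1844 message.
MORSE = {
    "A": ".-", "D": "-..", "G": "--.", "H": "....", "O": "---",
    "R": ".-.", "T": "-", "U": "..-", "W": ".--",
}

def encode(message: str) -> str:
    """Translate letters into dot/dash groups, separated by spaces."""
    return " ".join(MORSE[ch] for ch in message.upper() if ch in MORSE)

print(encode("what hath god wrought"))
# -> .-- .... .- - .... .- - .... --. --- -.. .-- .-. --- ..- --. .... -
```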
The following year, on May 1, 1844, a telegraph message was sent from Annapolis Junction, Maryland, to Washington, D.C., a distance of about twenty miles (thirty-two kilometers), announcing that Henry Clay had been nominated to run for president (Clay eventually lost). Still later, on May 24, a more famous message traveled the full length of the line, from Washington to Baltimore, when Morse, using Morse code, sent a message taken from the Old Testament, the first part of the Bible, that said: "What hath God wrought?"
Thus began intense competition to spread the telegraph throughout the United States. Morse and his business associates raised private funds to extend the telegraph all the way to New York, via Philadelphia, Pennsylvania. Small telegraph companies began linking smaller cities throughout the East, South, and Midwest.
Railroads played a key role in developing the telegraph. First, railroads let telegraph companies set up poles and string wires beside their tracks. Second, they used the telegraph to control train traffic: at the time, most railroads had a single pair of tracks between two points; they needed fast communications to avoid head-on, or even rear-end, collisions.
Morse's system printed the codes on a ribbon of paper, and operators learned to translate the series of dots and dashes into letters, at the rate of about forty-five words per minute. By 1914 that speed had increased to nearly one hundred words per minute. (In 2003 a high-speed cable modem connected to the Internet could transmit around one-and-a-half million words per minute.)
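Simple arithmetic shows the scale of that change; the dispatch length below is an assumed example.

```python
# Time to transmit a hypothetical 10,000-word dispatch at the speeds
# quoted above (words per minute).
words = 10_000
for label, wpm in [("1840s telegraph operator", 45),
                   ("1914 telegraph", 100),
                   ("2003 cable modem", 1_500_000)]:
    print(f"{label:>24}: {words / wpm:>9,.2f} minutes")
# The operator needs almost four hours; the modem, a fraction of a second.
```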
The telephone
The story of Alexander Graham Bell (1847–1922), the Scottish-born teacher of the deaf at Boston University, and his telephone is widely known. Instead of electric pulses that came out as dots and dashes, the telephone was able to translate the human voice into electrical impulses that were converted back to sound at the other end of a long wire.
The principle of the telephone had been demonstrated first by Michael Faraday in 1831, but it took another forty-five years to prove its practicality. As with so many inventions, it was an idea that many people were working on simultaneously. In fact, Alexander Graham Bell filed his application for a patent just two hours ahead of another inventor, Elisha Gray (1835–1901). The history of the telephone is also another demonstration of the principle that in the Industrial Revolution, inventing something new was never enough. Success went to the person who could expand an invention into a successful business, which, in the case of the telephone, meant spreading its reach into virtually every home and business in the United States.
In the twenty-first century, the telephone is everywhere—even, thanks to wireless technology, in people's pockets. It seems odd to suggest that the telegraph, now largely a technology of the past, was actually a more significant invention than the telephone in the history of the Industrial Revolution. The reason is an issue of timing. The telegraph proved to be critical in helping railroads expand; in turn, rapid and reliable transportation made it feasible to bring in more raw materials and ship out more manufactured goods, from ever larger factories. Railroads also played a key role in providing food from faraway farms and ranches to industrial cities in the East.
Coal Gas and Natural Gas
Compared to coal, petroleum, and electricity, flammable gas was a secondary source of energy in the 1800s. At first, a gaseous byproduct of manufacturing coke (a form of refined, or processed, coal that burns hotter than coal itself) called coal gas was collected and distributed by pipelines in many cities. Large outdoor lamps burned coal gas to provide the first effective streetlights; each evening, a lamplighter would come along the street and light the lamps. Coal gas lamps were also used to light stores, factories, and homes. It was many years before electric lighting succeeded in displacing gas lighting, which was at one time much cheaper and brighter than the first electric lights. Coal gas is seldom manufactured today.
Natural gas (as opposed to manufactured coal gas), found in huge underground reservoirs, is also distributed by pipelines. In areas where it is economical to do so, this naturally occurring gas is used for heating and for cooking, and to generate electricity in power plants. Natural gas is almost never used for lighting, but it continues to play an important role in providing energy to industry as well as to individual homes.
Considering its importance in modern life, it may be hard to imagine the telephone ranking in second place to the telegraph in the story of industrialization. But it would be hard to point to a change made possible by the telephone that had not already started with the spread of the telegraph.
Electric motors
Electricity, like petroleum, was a key enabling technology of the second phase of the Industrial Revolution. The introduction of electricity and its two main uses aside from communications—electric motors and electric lighting—introduced some of the most dramatic and pervasive social changes of the Industrial Revolution.
Humankind's fascination with electricity has a long history, dating to Greece in 600 b.c.e. The Greeks noticed that rubbing amber against a fur cloth caused particles of straw to cling to the amber. The mystery of this phenomenon (which is today called static electricity) was not solved until 1600, when an English scientist, William Gilbert (1544–1603), who was Queen Elizabeth I's physician, investigated the phenomenon and first applied the word "electric" in a report about magnetism. In 1752 Benjamin Franklin (1706–1790), the American statesman and inventor, demonstrated that lightning and electricity were the same thing. In 1800 Italian scientist Alessandro Volta (1745–1827) invented the first form of a battery and demonstrated how electricity could flow through a wire connected to it. (The word "volt," a measure of electrical potential, honors Volta's role in advancing knowledge about electricity.)
In 1831 Faraday discovered how to generate a flow of electricity by moving a magnet inside a coil of copper wire. The electrical current generated by Faraday was small, but the principle of the generator was established. It was just a matter of time before generators grew in size and electricity came to power motors.
During the second phase of the Industrial Revolution, many machines came to be powered by electricity rather than by steam or gasoline engines. An electric motor uses magnetism to move its parts. Magnets can be made quite small, and therefore electric motors can be made much smaller in scale than either coal-burning steam engines or oil-powered internal combustion engines. Indeed, unlike steam- or petroleum-powered engines, electric motors do not require any combustion at the site where they are used (although the large generators that supply them, which may be located far away, often burn coal or petroleum to create electricity). The changes electric power brought to society were enormous: power could be delivered to every factory, business, and house without the need for bulky fuel and huge engines.
Aside from Faraday, two other inventors whose names are associated with electricity greatly advanced the second stage of the Industrial Revolution: Thomas Alva Edison and George Westinghouse. Edison is credited with inventing or perfecting many uses for electricity, including a practical light bulb, while Westinghouse was key in developing ways to generate electricity on a large scale.
Thomas Alva Edison
Thomas Alva Edison (1847–1931) was a prolific inventor who found many new ways to use the power of electricity. He also was a successful businessman who succeeded in organizing huge companies to generate electricity and arranged for laboratories to carry on new experiments and find yet more uses of electricity.
By the end of his life, Edison had more than one thousand patents for a wide variety of devices. Some of these inventions, however, had actually been developed by people working for him, or had been purchased by Edison. Moreover, while many of the inventions with which Edison is credited (the light bulb, for example) are undoubtedly an essential part of modern life, they did not necessarily push forward the Industrial Revolution so much as add to the comfort and convenience of everyday life.
Edison did not have an extensive formal education. He had trouble in school and was largely educated at home by his mother, who was a teacher. As a child he disliked mathematics but was fascinated by chemistry. By age twelve he was out of the house, earning money selling newspapers, tobacco, and candy on a railroad that linked Port Huron, Michigan, and Detroit. Waiting for the train to turn around for a return trip gave Edison plenty of time to read on his own, a habit instilled by his mother. He also owned a chemistry set, with which he conducted experiments in the baggage car. Throughout his life, Edison believed in invention by trial and error.
Edison became interested in the newly invented telegraph, which was spreading rapidly to cities in the United States. After Edison saved the son of a stationmaster from falling beneath a train, the grateful father taught him Morse code, allowing Edison to get jobs as a telegraph operator, working in various cities across the United States and Canada. In this environment, Edison's natural tendency to experiment and innovate led to a machine that recorded the clicks coming in over the telegraph wires. He then adapted this technology and developed another machine designed to record the voice votes of legislators. Edison was granted his first patent—for the vote-recording machine—in 1869.
Later that year, in New York City, Edison developed a variation of a "printing telegraph," a device to record information about the price of gold. Edison, working with partners, sold the design for the gold "ticker" to Western Union (the main telegraph company of the era), and he also got a job with the company. A few months later, he modified his gold ticker to record the prices of stocks as they were bought and sold on New York's stock exchanges. This invention had a large potential market, and the president of Western Union rewarded Edison by paying him $40,000, money Edison then used to start his own company across the Hudson River in New Jersey.
Edison's new company manufactured stock tickers and also paid engineers to develop new electrical devices or to improve existing ones. Among machines developed at Edison's company were an early form of mimeograph (a copying machine), an improved typewriter, and a method of sending four telegraph messages across a wire at the same time (instead of just one). As a businessman, Edison was aggressive in obtaining patents for machines developed by his employees, and even for small improvements on existing machines—a practice that helped him gain more patents than any other individual in the history of the U.S. Patent and Trademark Office.
By 1876, profits from his business enabled Edison to set up what he called an "invention factory" in Menlo Park, New Jersey, which was the country's first corporate research laboratory. Having seen how new inventions could earn significant sums of money, Edison set a goal of developing one new invention every four days (he eventually succeeded in inventing something new every five days). The firm's inventions ranged widely and eventually resulted in an improved version of the telephone, the phonograph, and the incandescent light bulb.
An English inventor, Sir Joseph Swan (1828–1914), had come up with the idea of sending electricity through a thin wire encased in a glass bulb, to be used as a lamp. By creating a vacuum inside the bulb (sucking out the air), Swan prevented the wire (called a filament) from catching fire, but the heat of the electricity nevertheless caused filaments to melt or break. Thomas Edison tackled the problem and found that a filament of charred cotton burned for enough hours to make the electric light bulb practicable.
On New Year's Eve, 1879, in Menlo Park, Edison demonstrated his new bulb by lighting street lights along a half-mile stretch, as well as his company's building. The electricity was supplied by a generator Edison had built and installed in his company headquarters. He thus demonstrated not only the light bulb, but also the principle of generating electricity at a central point and distributing it via wires strung over a significant distance. Three years later, in 1882, Edison built an electric generating station in New York City, where he supplied about 85 customers with electricity that flowed to about 400 outlets.
It was this last development—central generation of electricity and distribution over a grid, or network, of wires throughout a city—that was Edison's biggest contribution to the Industrial Revolution.
But at the same time, Edison made what proved to be a strategic miscalculation. His error involved the nature of electric current (the flow of electrons over a wire). Edison had built generating equipment that sent electricity in one direction only, a system called direct current (DC). An eccentric immigrant from Croatia named Nikola Tesla (1856–1943), who had come to work for Edison's company in 1884, proposed a different approach: alternating the direction of the current. Tesla demonstrated that while direct current tended to weaken over distance, alternating current (AC) could be distributed over a much longer distance without losing its power. Tesla also showed that the voltage of alternating current could be increased (stepped up) at a generating plant, then decreased (stepped down) near the final customer, making AC a far more efficient means of distributing electricity.
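Tesla's argument can be made concrete with a little arithmetic. For a fixed amount of delivered power, raising the transmission voltage lowers the current in the line, and the resistive loss falls with the square of the current. The power level and line resistance below are assumed, illustrative values.

```python
# Why stepping up the voltage (easy with AC transformers) cuts losses:
# for fixed delivered power P, current I = P / V, and the line wastes
# I**2 * R as heat. All figures below are assumed, illustrative values.

def line_loss_watts(power_w: float, volts: float, resistance_ohms: float) -> float:
    current_amps = power_w / volts
    return current_amps ** 2 * resistance_ohms

for v in (1_000, 10_000, 100_000):
    print(f"{v:>7,} V: {line_loss_watts(100_000, v, 5):>9,.1f} W lost")
#   1,000 V:  50,000.0 W lost
#  10,000 V:     500.0 W lost
# 100,000 V:       5.0 W lost
```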
Edison resisted Tesla's idea, partly because he had already invested money in devices that generated direct current, and partly because he thought (incorrectly) that AC was dangerous. But Tesla kept arguing for his idea, and Edison fired him in 1885, a year after he joined Edison's company.
George Westinghouse and Nikola Tesla
Tesla was a creative genius (he eventually held seven hundred patents) who had a tendency to exhibit odd behavior. He claimed he had communicated with creatures from other planets, for example, and he was terrified of women who wore pearl earrings.
After he was fired by Edison, Tesla arranged to work for the inventor George Westinghouse (1846–1914). Westinghouse had earned a fortune from his invention of the air brake, which used compressed air instead of friction to stop a moving vehicle and was widely used on trains; the air brake, patented in 1869, was one of some 400 patents Westinghouse eventually owned. Westinghouse quickly saw the advantages of Tesla's alternating-current generator. In 1888, Westinghouse bought rights to Tesla's AC generator and agreed to use his fortune to develop the idea in competition with Thomas Edison.
Success did not come at once. In 1893, however, Tesla and Westinghouse won a contract to provide AC power to the Columbian Exposition in Chicago, a kind of world's fair. Tesla's alternating current system was a huge success, and eventually it won out over Edison's direct current. (DC is still used, however, in batteries, which are normally located very close to the object they are intended to power, such as a flashlight or portable CD player.)
Widespread delivery of electricity made possible not only illumination in factories and homes, but also motors to drive machinery. Electricity took its place alongside the steam engine and the internal combustion engine as one of the key sources of energy driving the Industrial Revolution forward. Tesla's real vindication came in 1917 when he won the top award of the American Institute of Electrical Engineers: the Edison Medal.
Applied science changes the world
The first stage of the Industrial Revolution had been marked by the tinkering of talented individuals with a gift for mechanics. A century later, however, the course of the revolution changed as principles of science—particularly those in the fields of metallurgy and chemistry—were increasingly applied to technology.
Metallurgy and steel
Since about 1500 b.c.e., humankind has used iron as the metal of choice for making tools. Iron is an element, one of the fundamental substances of the Earth. It is found buried in the Earth's crust in the form of an ore, which means it is combined with other elements, such as manganese, silicon, phosphorus, sulfur, and especially carbon. Using iron to make objects is a two-step process: first, the iron ore is heated to around 2,800 degrees Fahrenheit (1,538 degrees Celsius) in order to remove impurities, such as sulfur. At this temperature, iron is in a liquid form. The high temperatures cause many other elements to vaporize, leaving a mixture of iron (about ninety-six percent) and carbon (about four percent). After this first stage, the metal is called pig iron (so named because it is poured into containers that resemble baby pigs gathered around their mother).
Pig iron is normally subjected to further treatment to create other forms of the metal, including cast iron and steel. The principal treatment involves reheating pig iron to remove some of its carbon content or to introduce other elements (such as chromium) to give the final product the desired characteristics. The amount of carbon mixed with the iron in particular affects the character of the end result. Iron may be relatively harder or softer, easier to shape with hammers or nearly impossible to dent, susceptible to bending or resistant to deformation. One mix of iron and carbon results in a product called cast iron, which is poured into shaped molds. Cast iron is widely used to make components of engines, for example.
Steel is another form of processed iron, in which the proportion of carbon has been reduced to less than 2 percent. Unlike cast iron, steel is relatively easy to bend into shapes (such as the exterior body of a car, for example). Steel can be strong even when pressed into thin sheets. Other elements can be added to steel to give it other desired qualities: stainless steel, with its shiny surface, resists rust when exposed to moisture, thanks to the addition of the element chromium. Metals created by mixing together separate elements (such as iron and chromium) are called alloys, and the study of how metals behave and how they can be altered is called metallurgy.
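The carbon-content distinctions described above can be summarized in a short sketch. The boundary values are approximate textbook figures, not precise metallurgical standards.

```python
# Rough classification of iron by carbon content (approximate figures).
def classify_iron(carbon_percent: float) -> str:
    if carbon_percent > 2.0:
        return "cast iron"          # hard, brittle; poured into molds
    if carbon_percent > 0.05:
        return "steel"              # strong yet bendable into shapes
    return "nearly pure iron"       # soft and easily worked

for c in (4.0, 1.0, 0.3, 0.02):
    print(f"{c}% carbon -> {classify_iron(c)}")
```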
Metallurgists continually seek to develop new forms of metal, iron in particular, in order to make products that overcome specific shortcomings, such as the tendency to rust, or products that provide more advantages (strength combined with light weight, for instance). Humans have been experimenting with metallurgy for thousands of years; iron became the dominant metal for making tools about three thousand years ago, and steel was developed at least two thousand years ago. In some respects metallurgy could be called the science of "cooking" metals—heating iron, for example, cooling it, heating it again, mixing in other elements—in order to produce a product with the desired characteristics.
In the middle of the nineteenth century, an Englishman named Henry Bessemer (1813–1898) developed a new method to convert pig iron into steel. Bessemer's method reduced the time and cost of processing pig iron into steel, long considered the most desirable form of iron because it is more durable, can withstand greater stress, and can bear greater weight than other forms of the metal. Bessemer's process made steel an economically feasible alternative for making a wide variety of manufactured products.
The key issue in making steel is reducing the carbon content. Doing so requires heating pig iron to around 1,300 degrees Fahrenheit (about 700 degrees Celsius), which takes both time and fuel (such as coal) burning in a furnace. In 1856 Bessemer discovered that blasting air through molten iron actually increased its temperature, as the oxygen in the air burned away much of the carbon that was still in the iron, resulting in steel. In his patent application Bessemer called his invention "a decarbonization process, utilizing a blast of air." His method involved pouring molten iron into a pear-shaped bucket (called a converter), then blowing air through it in order to raise the temperature of the liquid iron and burn away impurities, notably carbon. Before Bessemer developed the technique, steel was made by hand in small quantities, typically for weapons (like swords) and hand tools. The older process took about three hours and made relatively small quantities, around fifty pounds (twenty-two kilograms), of high-quality but expensive steel. Steel made using Bessemer's process was not of the highest quality (that is, devoid of impurities), but it was relatively inexpensive to manufacture, and this development enabled steel to replace iron in many applications, including miles of railroad track and the interior framework of tall buildings.
Bessemer was not the only person trying to find a better way to turn pig iron into steel. An American businessman, William Kelly (1811–1888) of Eddyville, Kentucky, owned iron mines and also a furnace for making pig iron. In the early 1850s, he developed a steel-making process very similar to Bessemer's, but he did not apply for a patent until 1857, after Bessemer had received a patent in Britain. The two men quarreled for years over who had the rights to the method, but eventually Kelly ran out of money and dropped out of the battle, clearing the way for Bessemer to license his technique for converting pig iron into steel.
Bessemer's technique was widely adopted and used for more than a century. It was replaced in the 1960s by a newer technique that cut the time needed to heat pig iron from about nine hours to about forty-five minutes. A Scottish-born entrepreneur named Andrew Carnegie (1835–1919) licensed Bessemer's technique and built the largest steel-manufacturing company in the United States, earning one of the world's greatest fortunes from it (see Chapter 6).
Bessemer's invention marked a major turning point in the Industrial Revolution. It came at a perfect time: railroads were being built rapidly, especially across the United States, and engineers soon saw advantages in using steel for rails instead of iron. Steel also came to replace iron in the interior framework of tall buildings and in a wide range of other structures and machines.
Chemistry
For centuries, people have been fascinated with the nature of substances, and with how their characteristics change when they are heated or put together with other substances. In the Middle Ages (500–1400), some people experimented with ways to turn a cheap metal, lead, into gold, as a means of gaining an instant fortune. Such experiments were called alchemy, and if they worked at all (which was seldom), it was purely by chance.
Chemistry, on the other hand, is the organized, systematic study of the fundamental characteristics of substances. As a continuation of a wave of scientific discovery that had started during the Renaissance (c. 1400–1700), chemistry has led to a wide variety of discoveries, starting in the nineteenth century and continuing ever since. Many of these discoveries solved business problems, or made a major difference in everyday life. Chemists have had a variety of motives. Some chemists experimented from curiosity; others were looking for new materials that could replace more expensive ones and save money in manufacturing goods; still others sought substances that were more reliable or uniform than substances that occur in nature. The notion of using manufactured substances in place of natural ones became a major theme as the Industrial Revolution progressed, and it resulted in the establishment of chemistry laboratories by manufacturing companies looking to increase profits. Targeted research (that is, looking for solutions to specific business problems) grew into an important supplement to more general or fundamental research conducted by scientists working in universities.
One example of this new industrial scientific process came in the search for dyes used to add color to cotton or wool fabric. People had long made dye from plants (imagine squashing red berries and pouring the liquid onto white cloth), but manufacturers in the nineteenth century were looking for specific ways to save money and also to make goods that were uniform (the same shade of red, for example). They turned to chemists to find answers; in the case of dyes, it meant developing artificial chemical substitutes for naturally occurring organic dyes.
Dynamite is another important chemical discovery of the era. Dynamite was developed in 1866 by the Swedish chemist Alfred Nobel (1833–1896). It provided a relatively safe, inexpensive explosive that could be used to tunnel past rocks in a mine or clear away boulders blocking the route of a railroad. But Nobel also had a personal motivation. His brother Emil was killed in 1864 in an explosion of nitroglycerine, an explosive substance invented earlier by the Italian chemist Ascanio Sobrero (1812–1888). Nobel wanted to modify nitroglycerine to make it safer to use. He added silica, which enabled him to manufacture a paste that could be shaped into cylinders. These, in turn, could be inserted into holes drilled by miners or railroad workers to blast apart rocks. Sales of his invention made Nobel a wealthy man, and he later willed part of his fortune to fund the Nobel Peace Prize and other prizes for achievements in science and literature.
An American, Charles Goodyear (1800–1860), in 1839 invented a process to treat natural rubber (derived from the sap of rubber trees) called vulcanization. Goodyear's invention kept articles made of rubber from melting at moderately high temperatures (even a hot summer day could cause natural rubber to lose its shape). Vulcanization made rubber, with its natural resilience, a useful material for goods like tires, which gave a much smoother ride than wooden wheels.
Chemistry also yielded a new group of materials, called plastics. In 1869, the American John Wesley Hyatt (1837–1920) was looking for a substitute for ivory from elephant tusks to use in manufacturing billiard balls. Elephants were being slaughtered by the thousands to keep pace with the demand for these balls. Thinking about this problem, Hyatt noticed that when he spun a bottle of collodion (a glue-like solution containing cellulose from plant cells, often used in medicine to close small wounds), it congealed into a tough, flexible film. Collodion was not successful for its intended purpose: it was highly flammable and caused billiard balls to explode when they crashed into one another. Hyatt solved this problem by adding camphor, derived from the laurel tree, to make celluloid. Celluloid could be heated and poured into a mold, then cooled, at which point it would retain its new shape (such as that of a billiard ball). It was hard and sturdy, like iron, but much lighter and easier to manufacture.
In 1907, Leo Baekeland (1863–1944), a Belgian-born chemist working in New York, was trying to develop a superior coating for the surface of bowling alleys, which were becoming popular. He combined carbolic acid and formaldehyde to create a substance he called Bakelite resin. The material could be poured into molds; once it cooled, it retained the shape of the mold and resisted heat and corrosion. Bakelite soon found a wide array of uses as a substitute for wood and metal that was relatively inexpensive to manufacture and shape.
By manipulating natural substances, such as cellulose, with heat and by stirring in other substances, chemists developed a wide range of artificial materials that could substitute for natural ones. Rayon, for example, was intentionally developed as a substance with chemical properties similar to the secretions of silkworms in order to manufacture an artificial and less expensive version of silk.
Chemistry has had a major impact on modern life. By developing new substances to replace more expensive older ones, chemistry has made it possible for manufacturers to produce goods at prices that most people can afford. It has made possible some inventions, such as movies (film for moving pictures was an early application of celluloid), that have changed the lives of millions of people. Chemistry has also had a major impact on one of the most fundamental human needs: food.
The second agricultural revolution
While the name Industrial Revolution may conjure up visions of urban factories belching smoke, during the nineteenth century important changes also swept across agriculture. These changes fall into three main categories:
- Chemistry, in the form of new fertilizers, pesticides, and herbicides.
- Farm machines, such as the tractor powered by an internal combustion engine, that substituted for the pulling power of horses and oxen and enabled farmers to work larger fields.
- Food preservation techniques that made it possible for farmers to sell food in distant cities, long after it was freshly harvested.
Together, these three developments enabled farmers to efficiently support a growing urban population—a trend that has not stopped. The introduction of machinery and science to agriculture had virtually the same impact as it had earlier on cottage industries: fewer farmers producing more food at a lower price. And as with the Industrial Revolution in towns and cities, small, independent farmers were gradually replaced by much larger farms, often owned by businesses that could afford the latest in machinery, fertilizer, pesticides, and herbicides.
Chemicals
Agriculture benefited enormously from the application of chemistry to industry. In a process that started in the 1820s and continued for nearly a century, German scientists took the lead in developing fertilizers that greatly improved the productivity of land. Some fertilizers were developed from natural substances (such as guano, or bat droppings); others were developed by combining elements (such as nitrogen and hydrogen) into forms that could fertilize fields. In yet another area, chemists developed pesticides (to kill insects) and herbicides (to kill weeds), both of which helped increase the yields from crops.
Machines in the field
As European Americans moved into what seemed to them the empty territory of what is today the American Midwest—displacing, in the process, the great herds of bison and the Native American people who hunted them—the size of farms grew significantly. The increase in acreage was made possible in part by introducing a series of machines to farming.
One of the most significant of these inventions was the reaper, developed by Cyrus McCormick (1809–1884) of Virginia, to cut down ripe crops like wheat. The reaper was, in essence, a giant cylinder equipped with blades. As the cylinder rolled across a field, the blades came down and cut wheat stalks at ground level. Previously, people with hand-held tools called scythes (a kind of curved knife at the end of a long handle) did this work. The mechanical reaper enabled one person to do the work of many. And it was designed to gather up the fallen wheat into bundles, saving even more time. Not only could harvesting be completed by about half as many people as were needed with scythes, the reaper also saved crops from being ruined by rain after they were cut to the ground. McCormick patented his design in 1834, and three years later he started making the machines on his family estate, selling them door to door. Later he licensed others to build his reaper in other parts of the country. In 1847 McCormick established a factory in Chicago, Illinois, which became one of the leading manufacturing companies of the nineteenth century. McCormick himself became wealthy and later invested in railroads and mining.
The next major invention of the agricultural revolution was the self-polishing cast steel plow. A plow is a tool that is pulled through a field (by a horse or ox prior to the development of powered tractors) in order to cut little trenches, called furrows, in the soil. Seeds can then be planted in the furrows. As the plow's blade (called a plowshare) moves through the soil, bits of dirt or mud tend to stick to it, making it harder to pull and also resulting in a wider furrow than needed for the seeds. This was especially a problem with cast iron plowshares trying to cut through the thick soils of the Midwest. Farmers needed to stop and scrape the plowshare clean in order to proceed through the field. A better version of the plowshare was designed in 1837 by a blacksmith (someone who makes tools out of iron) named John Deere (1804–1886), who had moved to Illinois from his native Vermont the previous year. Noticing that the cast iron plows made in Vermont were ill-suited to the thicker, heavier soils of the Midwest, Deere substituted highly polished steel for the cast iron plowshare. Soil did not cling so easily to the smooth, shiny surface of Deere's plow, making work easier and faster for farmers, who soon bought Deere's plows by the thousands.
Later, Deere was responsible for another innovation: instead of making plow blades only to fill specific orders, he manufactured them in large quantities and traveled around to display and sell them. The company he founded is still in business in the twenty-first century, best known for producing farm tractors.
Mechanical tractors were introduced in 1868 as a replacement for the horse or ox in pulling plows through the field or hauling wagons full of harvested crops into town. Initially tractors were powered by small steam engines, like small locomotives, and they primarily were used like a truck, for hauling loads over roads. Gradually they were also used in fields to pull machines or wagons, replacing farm animals. The first gasoline engines were installed on tractors by the Charter Gasoline Engine Company of Sterling, Illinois, in 1887. Henry Ford (1863–1947), the pioneer of automobile manufacturing, began producing tractors that used gasoline engines in 1907; he called them automobile plows. Gasoline-powered tractors were popular with farmers because they were easier to operate and enabled a single farmer to plant or harvest more land than was possible with either animals or steam-powered tractors.
The history of agriculture in the nineteenth century is filled with many other inventions that brought the advantages of engine power and mechanical help to the farmer. These advantages can be measured. In 1830, for example, it required 250 to 300 man-hours (one person working one hour) to produce one hundred bushels of wheat grown on about five acres of land. A century later it required only about 15 to 20 man-hours to produce the same amount of wheat on the same amount of land.
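Taking the midpoint of each quoted range makes the size of the gain explicit.

```python
# Man-hours needed to produce 100 bushels of wheat on about 5 acres,
# using the midpoints of the ranges quoted above.
hours_1830 = (250 + 300) / 2   # about 275 man-hours around 1830
hours_1930 = (15 + 20) / 2     # about 17.5 man-hours a century later
print(f"Labor requirement fell by a factor of about {hours_1830 / hours_1930:.0f}")
# -> Labor requirement fell by a factor of about 16
```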
From farm to table
The third key part of the modern agricultural revolution centered on food processing and preservation techniques (such as canning and refrigeration) that allowed farm goods to be delivered to urban customers far away. These developments owed a great deal both to the application of technology and to the organizational changes associated with the Industrial Revolution. Thus the story of food in the nineteenth century is the story of how the techniques of industrialization gave rise to companies that package and preserve food and deliver it to grocery stores.
In the era before the Industrial Revolution, the majority of Americans lived in the countryside. Their food came from the farm they lived on, or perhaps from a farm next door. People ate fresh fruit in the summer or early autumn. In the winter or spring, they ate preserves, fruit that had been cooked and stored in jars. Eggs came fresh from the farm's hens. But people living in crowded cities do not keep chickens or raise their own fruit. In order to provide them with food, methods were developed in the 1800s to ship food from the countryside to cities, and to keep it from rotting on the way.
Food preservation, like many other developments during the second phase of the Industrial Revolution, was the result of many discoveries and inventions rather than one big leap forward. Food preservation was critical to the growth of the United States; combined with efficient transportation over long distances (via railroad), it enabled the United States to support a large population of urban industrial workers.
Canning, that is, sealing cooked food inside a glass bottle or metal can to protect it from contaminants in the atmosphere, was developed in 1795 by a French chef, Nicolas-François Appert (c. 1750–1841). Appert was competing for a twelve-thousand-franc prize offered by the French Emperor Napoléon Bonaparte (Napoléon I; 1769–1821), who was searching for a way to preserve food to feed his army. Appert demonstrated a method of putting food (such as fruit) in glass containers, cooking it by placing the jars in boiling water, and then sealing the containers.
A few years later, Peter Durand of Britain demonstrated a similar technique, substituting iron cans coated with tin for Appert's glass bottles. Durand's solution was not ideal, however. The solder (pronounced SOD-er), a lead-containing metal alloy used to seal the cans, could cause lead poisoning if too much canned food was eaten over a short time. Durand's cans also raised another problem: the production of the cans themselves. Initially, a craftsman could produce perhaps sixty cans per day; only later were machines developed that could produce hundreds of cans per minute.
Nonetheless, the canning principle had been established and its use spread. Canned food was widely used to supply armies during the American Civil War (1861–65).
Food Preservation
There are five basic ways to prevent food from decaying:
- Freezing: Humans have long known that freezing meat and other foods slows or prevents the chemical process of decay. Archaeologists have uncovered evidence that as long as 10,000 years ago, cave-dwelling humans put slaughtered animals in the coldest part of their caves to preserve the meat by freezing.
- Heating: Cooking food (raising its temperature to a certain level) kills many bacteria (microscopic organisms) that might make it unsafe to eat. For cooked food to remain edible, it must be sealed in glass jars or metal cans to keep it away from airborne organisms that might contaminate it.
- Dehydrating: Dehydrating food, or removing the water it contains, has long been used to preserve foods. One example of a dehydrated food is pasta. Pasta is essentially flour and water that is formed into a shape, such as a spaghetti noodle, and then dried. Removing the moisture prevents chemical reactions that result in decay or rotting. When pasta is put into boiling water, the flour is rehydrated to make it edible.
- Fermentation: Fermentation relies on acids to prevent spoilage. These acids are produced when specific microorganisms, such as bacteria, molds, or yeasts, act on basic food materials (such as cow's milk), converting them into edible, nutritious substances that resist decay or spoilage (contamination by harmful bacteria). Cheese is a common example of a food converted and preserved by fermentation.
- Chemicals: Chemicals have been used for centuries to preserve food by killing poisonous bacteria. The oldest and most common chemical used to preserve food is sodium chloride (salt), which was used throughout the Middle Ages (500 to 1400) to store fish and meat. Some spices contain chemicals that also kill poisonous bacteria (as well as adding a lively taste to food).
During the nineteenth century, scientists came to understand the theories behind many of these ancient food preservation techniques and began to apply them on a large scale.
Refrigeration was another nineteenth-century invention that changed the face of food distribution. The scientific principle of refrigeration, the transfer of heat from one object to another, was set out by the French scientist Nicolas Sadi Carnot (1796–1832) in 1824. But it took many years before mechanical refrigeration replaced the age-old method of using ice to keep certain foods (especially vegetables and meat) relatively fresh and safe to eat. Well into the twentieth century, ice men delivered chunks of ice to homes, where the ice was stored in insulated ice boxes to help preserve food.
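In modern terms (a formalization developed after Carnot, using the absolute temperature scale rather than Carnot's own notation), his analysis sets an upper limit on how much heat a refrigerator can move for each unit of work it consumes. For a cold space at absolute temperature T_c and warmer surroundings at T_h, the best possible coefficient of performance is

\[
\mathrm{COP}_{\max} = \frac{T_c}{T_h - T_c}.
\]

For example, an ice box held at 0°C (273 K) in a 27°C (300 K) room could, at the theoretical limit, move about 273/27, or roughly ten, units of heat for every unit of work supplied; the machines of the nineteenth century fell far short of this ideal.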
Contemporary home refrigerators and air conditioners are the result of a long process of scientific inquiry and invention. As early as 1805 the American inventor Oliver Evans (1755–1819) designed, but never built, a refrigeration machine. The theory behind Evans's system, and the basic principle of refrigeration today, is that whenever a liquid changes into a gas it absorbs heat, thereby cooling its environment. In 1842 a physician, John Gorrie (1803–1855), followed Evans's basic design and built a machine in Florida to cool hospital rooms.
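The cooling effect can be quantified by the latent heat of vaporization. As an illustration (the refrigerant and figures here are typical examples, not taken from the text above), evaporating a mass m of liquid absorbs heat

\[
Q = m L_v,
\]

where L_v is the liquid's latent heat of vaporization. Ammonia, a refrigerant common in early machines, has an L_v of roughly 1,370 kilojoules per kilogram, so evaporating one kilogram of liquid ammonia absorbs about 1.4 million joules, enough to freeze roughly four kilograms of water (whose latent heat of fusion is about 334 kilojoules per kilogram).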
Although Gorrie did not receive a patent for his system until 1851, refrigerated railroad cars were used to carry milk products as early as the 1840s. J. B. Sutherland of Detroit, Michigan, received the first patent for a refrigerated railroad car in 1867. The temperature-controlled cars enabled Chicago, Omaha, and other midwestern cities to become centers of the emerging meatpacking industry; cattle were raised, slaughtered, and butchered in the West, and then their meat (instead of live cattle, as was done previously) was shipped to cities in the East.
For many years, refrigeration was limited to large machines used in factories, especially in the beer brewing industry. By the 1890s increasing water pollution (caused, for example, by the dumping of raw sewage, or human waste, into streams, lakes, and the ocean) made it difficult to find supplies of natural ice that did not themselves pose a health risk. Consequently, mechanical refrigeration was used to produce ice that was delivered, often daily, to individual homes to keep ice boxes cold.
Later still, the principles of refrigeration were extended to cooling air for comfort. The American engineer Willis Carrier (1876–1950) had observed that printing on paper worked better in the cool temperatures of winter than in the heat of summer. In 1923 he devised a method of applying refrigeration to lower the temperature in factories and public buildings, a practice that became known as air conditioning, in large part to enhance human comfort.
Refrigeration was an outstanding example of how principles of science were applied over the course of a century to advance the notion that human inventions could overcome some of the fundamental limits of nature, whether the limitations of muscle power or the temperature of Earth's atmosphere.
For More Information
Books
Berman, Daniel, and Robert Rittner. The Industrial Revolution: A Global Event. Los Angeles, CA: National Center for History in the Schools, 1998.
Butterworth, W. E. Black Gold: The Story of Oil. New York: Four Winds Press, 1975.
Danhof, Clarence H. Change in Agriculture: The Northern United States, 1820–1870. Cambridge, MA: Harvard University Press, 1969.
Davis, Henry B. O. Electrical and Electronic Technologies: A Chronology of Events and Inventors to 1900. Metuchen, NJ: Scarecrow Press, 1981.
Dudley, William, ed. The Industrial Revolution: Opposing Viewpoints. San Diego, CA: Greenhaven Press, 1998.
Fisher, Douglas A. The Epic of Steel. New York: Harper and Row, 1963.
Gross, Ernie. Advances and Innovations in American Daily Life, 1600s–1930s. Jefferson, NC: McFarland, 2002.
Horwitz, Elinor L. On the Land: American Agriculture from Past to Present. New York: Atheneum, 1980.
Meyer, Herbert W. A History of Electricity and Magnetism. Cambridge, MA: MIT Press, 1971.
Muir, Diana. Reflections in Bullough's Pond: Economy and Ecosystem in New England. Hanover, NH: University Press of New England, 2000.
Pursell, Carroll W. The Machine in America: A Social History of Technology. Baltimore, MD: Johns Hopkins University Press, 1995.
Sharlin, Harold I. The Making of the Electrical Age: From the Telegraph to Automation. London and New York: Abelard-Schuman, 1964.
Standage, Tom. The Victorian Internet: The Remarkable Story of the Telegraph and the Nineteenth Century's On-line Pioneers. New York: Walker and Co., 1998.
Periodicals
"Before Fridges: The Ice Trade." Economist, December 21, 1991, p. 47.
Cummins, Lyle. "Rudolf Diesel—The Man and His Mission." Diesel Progress, July 1985, p. D34.
Gray, Paul. "Thomas Edison (1847–1931): His Inventions Not Only Re-shaped Modernity but Also Promised a Future Bounded Only by Creativity." Time, December 31, 1999, p. 184.
Gustaitis, Joseph. "Samuel Slater: Father of the Industrial Revolution." American History Illustrated, May 1989, p. 32.
John, Richard R. "The Politics of Innovation." Daedalus, Fall 1998, p. 187.
Johnson, Jeff. "Nikola Tesla: Genius, Visionary, and Eccentric." Skeptical Inquirer, Summer 1994, p. 368.
Leone, Marie, et al. "Edison and Tesla: The Founding Fathers of Electricity." Electrical World, January–February 2000, p. 41.
Lieberman, Beth. "The Elemental Sparks." Smithsonian, February 2001, p. 44.
"The Memory of Samuel Slater." Yankee, August 1999, p. 108.
Morse, Minna Scherlinder. "Chilly Reception: Dr. John Gorrie Found the Competition All Fired Up When He Tried to Market His Ice-Making Machine." Smithsonian, July 2002, p. 30.
Rosenberg, Nathan. "The Role of Electricity in Industrial Development." Energy Journal, April 1998, p. 7.
Usselman, Steven W. "From Novelty to Utility: George Westinghouse and the Business of Innovation during the Age of Edison." Business History Review, Summer 1992, p. 251.
Web Sites
"History of American Agriculture, 1776–1990." U.S. Department of Agriculture Research Service.http://www.usda.gov/history2/text4.htm (accessed on January 31, 2003).
"The History of Oil." U.S Department of Energy.http://www.fe.doe.gov/education/oil_history.html (accessed on January 31, 2003).
"The History of the Automobile: The Internal Combustion Engine and Early Gas-Powered Cars." About.com.http://inventors.about.com/library/weekly/aacarsgasa.htm (accessed on February 17, 2003).
"Oil, Our Untapped Energy Wealth." U.S. Department of Energy.http://www.fe.doe.gov/education/ (accessed on January 31, 2003).
Taylor, Frederick. The Principles of Scientific Management. First published 1911. Fordham University. http://www.fordham.edu/halsall/mod/1911taylor.html (accessed on January 31, 2003.)