Computers and Computer Industry

COMPUTERS AND COMPUTER INDUSTRY. The electronic digital computer is the herald of the Information Age. Just as technologies developed in earlier ages liberated people from physical toil, computers have liberated people from the more tedious kinds of mental toil—and have revolutionized the transfer of information. The banking, insurance, and travel industries, to name a few, are vastly quicker and more responsive than they were a half-century ago. The computer industry employs hundreds of thousands directly, but many millions of people outside the industry use computers as an important tool in their jobs.

Calculating devices such as the abacus have existed for thousands of years. The distinctive feature of modern computers is that they are digital, operating on the digits 1 and 0 according to specified instructions. Computers are therefore programmable. A programmer can create complex behavior undreamed of by the computer maker, just as a novelist can use a typewriter to create new works of art.

The "Difference Engine" of English mathematician Charles Babbage (1792–1871) was an ancestor of the computer. Babbage proposed it as a calculating machine to improve the accuracy of celestial tables used in navigation. Human error introduced wrong numbers into these tables, costing lives at sea. To further limit the effects of human error, Babbage wanted to automate the whole process of entering numbers and combining results, so that complex formulas could be automated, or "programmed." To this end he designed the Analytical Engine with two parts: one part read and interpreted coded instructions from punch cards, and the other performed arithmetic. Babbage kept altering his designs, and the English government withdrew its support in frustration. Yet most of the elements of modern digital computers were present in Babbage's plans.

Although Babbage's concepts were essentially neglected for more than a hundred years, other developments took place. In 1886 William Burroughs built the first commercially successful adding machine. In 1936 Cambridge mathematician Alan Turing described a theoretical machine that manipulated symbols on a tape. By implication, computers were not limited to number crunching: given specific, clear instructions, they could manipulate any kind of data. During World War II, Turing designed a working electromechanical device (the Bombe) that helped break the German Enigma code and demonstrated the power of his ideas.
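
Turing's abstraction is simple enough to mimic in a few lines of a modern programming language. The sketch below is a hypothetical Python example, with an invented rule table that merely flips 1s and 0s; it shows the essential parts of a Turing-style machine: a tape of symbols, a read/write head, and a table of rules.

    # A minimal sketch of a Turing-style machine. The rule table below is an
    # invented example for illustration, not a historical program. The machine
    # stops when it enters the "halt" state or the head runs off the tape.
    def run_turing_machine(tape, rules, state="start"):
        tape = list(tape)
        head = 0
        while state != "halt" and 0 <= head < len(tape):
            symbol = tape[head]
            new_symbol, move, state = rules[(state, symbol)]
            tape[head] = new_symbol
            head += 1 if move == "R" else -1
        return "".join(tape)

    # Rules: in state "start", write the opposite bit and move right.
    rules = {
        ("start", "0"): ("1", "R", "start"),
        ("start", "1"): ("0", "R", "start"),
    }

    print(run_turing_machine("10110", rules))   # prints 01001

Changing the rule table changes the machine's behavior entirely, which is the sense in which such a device can manipulate any kind of data.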

The First Electronic Computers

Two factors spurred development of computers in the mid-twentieth century. One was the war effort, which needed quick calculation of ballistic paths. The other was electronics, which made it possible to use wires and vacuum tubes to simulate logical operations. Engineers could replace Babbage's slow interactions of levers and gears with particles whose speed approached that of light.

During World War II the U.S. Army funded development of a digital computer, planning to use it for military calculations. On 14 February 1946 J. Presper Eckert and John Mauchly unveiled the first electronic digital computer at the University of Pennsylvania. They called their machine the Electronic Numerical Integrator and Computer (ENIAC). ENIAC required 1,800 square feet and 18,000 vacuum tubes. The use of vacuum tubes made it unreliable, for their combined heat often caused one or more to malfunction. Each month operators had to replace 2,000 tubes. Yet for all its limitations, ENIAC proved useful for its time. It performed a then-impressive 5,000 additions and 360 multiplications per second.

Improvements soon followed. Hungarian-born scientist John von Neumann (1903–1957) suggested the idea of storing programs in memory alongside data. This freed a user from having to reprogram a computer every time it was used for a different purpose. (Consider that when you run a personal computer, you can run different programs with the click of a button.) So significant was this advance that computer scientists sometimes refer to modern computers as "von Neumann processors."
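
The stored-program idea can be illustrated with a small, purely hypothetical sketch in Python: instructions and data occupy the same memory cells, so loading a new program is simply a matter of changing the contents of memory. The instruction codes used here are invented for illustration and are not drawn from any historical machine.

    # A sketch of the stored-program idea: instructions and data live in the
    # same memory, so a new program is just new memory contents, not new wiring.
    def run(memory):
        acc = 0   # accumulator: holds the value being worked on
        pc = 0    # program counter: index of the next instruction in memory
        while True:
            op, arg = memory[pc]
            pc += 1
            if op == "LOAD":
                acc = memory[arg]
            elif op == "ADD":
                acc += memory[arg]
            elif op == "STORE":
                memory[arg] = acc
            elif op == "HALT":
                return memory

    memory = [
        ("LOAD", 5),    # cell 0: load the number in cell 5
        ("ADD", 6),     # cell 1: add the number in cell 6
        ("STORE", 7),   # cell 2: store the sum in cell 7
        ("HALT", 0),    # cell 3: stop
        None,           # cell 4: unused
        2, 3, 0,        # cells 5-7: data, stored right beside the program
    ]

    print(run(memory)[7])   # prints 5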

Although Eckert and Mauchly's invention changed the world, they never reaped the financial rewards that others did. They left the University of Pennsylvania in 1946 to form the Eckert-Mauchly Computer Corporation. But production of computers required capital, and shortly afterward they sold their firm to the Remington Rand Corporation. Joining Rand as engineers, they produced the Universal Automatic Computer (UNIVAC) in 1951. The first model was installed at the U.S. Bureau of the Census.

From the start UNIVAC was intended for general commercial use. But throughout the 1940s and 1950s computers remained expensive, and commercial acceptance came slowly. By 1957 only forty-six UNIVAC machines were in use. The high cost of computing power can be seen by looking at a late-model UNIVAC. This machine offered a 1.3 megahertz processor, half a megabyte of RAM (Random-Access Memory), and a 100 megabyte hard drive—all representing a fraction of the power available by the early 2000s for less than $1,000. In 1968 this UNIVAC model could be had for $1.6 million.

Data storage became important, especially for commercial uses, and devices such as magnetic tape and drums came into use. Meanwhile, new programming languages were developed. FORTRAN (FORmula TRANslation) and COBOL (COmmon Business Oriented Language) facilitated easier, faster writing of scientific and business applications, replacing much of the work being done with machine code.

The Rise of IBM

In contrast to Rand's commercial difficulties with the UNIVAC, International Business Machines (IBM) succeeded so well in the 1950s and 1960s that for a time it became synonymous with the computer industry itself. A merger in 1911 formed the company as the Computing-Tabulating-Recording Company; Thomas J. Watson Sr. later took over the company, renamed it International Business Machines in 1924, and expanded the product line, overseeing production of its first computers. The field was a natural one for the company to expand into, because it already made tabulating machines that used punch cards.

As much as any technical innovation, IBM's army of salesmen and its marketing expertise contributed to its success. Beyond selling machines, IBM sold a reputation for service and support. At the time a computer was not a commodity but a major investment, and IBM's size and solidity reassured customers. This strategy served the company well up until the era of personal computing. Spurred on by the Korean War, IBM developed the 700 series to meet the needs of the Defense Department. Meanwhile, IBM's lower-cost 650 series and 1400 series brought commercial success, selling 1,800 and 12,000 units respectively.

During the late 1950s Thomas J. Watson Jr. succeeded his father as chairman and decided to invest $10 billion in a new line of computers, the 360 series. For the time, this was an astounding sum of money. Watson's gamble was the most expensive development ever attempted in any private industry. It amounted to betting the company. The investment paid off. For several decades the 360 series (and its successor, the 370 series) secured IBM's dominance in the field of large computers—now called "mainframes"—and demand mushroomed. Around the world computers began to take over tasks previously relegated to roomfuls of clerks: compiling statistics, retrieving data, calculating actuarial tables, and printing company payrolls.

From Mainframe to Minicomputer

The growing acceptance of computers in the corporate world was aided by further developments in applied physics. In 1948 physicists at Bell Laboratories (including controversial Nobel Prize-winner William Shockley) invented the transistor, a device that lets a small current control another, potentially larger current. By combining transistors of different kinds, engineers can implement logical operations such as AND, OR, and NOT. Transistors can therefore act as building blocks for digital processors, just as vacuum tubes once did.
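
How such logic operations become arithmetic can be suggested by a brief sketch. Treating AND, OR, and NOT as given primitives (the roles transistors play in hardware), the hypothetical Python example below wires them into a "half adder" that adds two binary digits.

    # Logic gates as building blocks: a half adder produces a sum bit and a
    # carry bit from two input bits, using only AND, OR, and NOT.
    def AND(a, b):
        return a & b

    def OR(a, b):
        return a | b

    def NOT(a):
        return 1 - a

    def half_adder(a, b):
        # The XOR needed for the sum bit is built from the primitives:
        # (a OR b) AND NOT (a AND b)
        total = AND(OR(a, b), NOT(AND(a, b)))
        carry = AND(a, b)
        return total, carry

    for a in (0, 1):
        for b in (0, 1):
            print(a, "+", b, "=", half_adder(a, b))   # (sum bit, carry bit)

Chaining such adders together gives addition of whole binary numbers, and comparable arrangements yield the other operations a processor needs.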

Transistors offered many advantages over vacuum tubes. Because they produced far less heat, they could be placed close together; this made miniaturization possible, in turn reducing the distance that electrons had to travel. Transistors improved speed, power, and reliability—all while lowering cost.

Individual transistors began to replace the use of vacuum tubes in the 1950s. But that was a small change compared with what followed. The 1960s saw the development of integrated circuits, combining many transistors on a small rectangle (or "chip") of silicon. As of the early 2000s, a silicon chip, not much bigger than a postage stamp, could contain more than 20 million transistors.

The availability of greater computing power at much lower cost led to fragmentation of the market. At one end better mainframes were developed to take advantage of greater computing power. At the other end new, cheaper lines of computers made computing accessible to smaller organizations, and prices fell dramatically.

This smaller type of computer was dubbed the "minicomputer." In some ways the name is misleading because, although smaller than a mainframe, a minicomputer is larger than a personal computer. (Personal computers were at first called "microcomputers.") There is no absolute dividing line between mainframes and minis; the distinction is partly subjective. Generally, a machine is a "mainframe" if it is among the larger and more powerful computers that the technology of the day can produce. Minicomputers are smaller and more affordable.

The field of minicomputers became the focus of Digital Equipment Corporation (DEC), founded in 1957. The company produced the PDP (Programmed Data Processor) line of computers and, later, the VAX (Virtual Address eXtension) line. These enjoyed particularly wide use at universities. DEC revolutionized the business by popularizing time-sharing, a concept first developed at the Massachusetts Institute of Technology.

Traditionally, only one program ran on a computer at a time. Users had to submit programs in punch-card form to a system administrator. The results might be returned the next day. Time-sharing, in contrast, switches control of the computer between multiple users several times a second. (During each switch the computer saves the work of the previous user and restores the work of the next.) In this setup each user communicates with the computer by means of a monitor and a keyboard, and each user has the illusion that he or she is the only one. This was a major step toward personal computing.
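
The essence of time-sharing can be suggested by a small, hypothetical Python sketch: each job runs for a brief slice, its state is saved, and the machine moves on to the next user in line. The user names and amounts of work below are invented for illustration.

    from collections import deque

    # Round-robin time-sharing: run each user's job for a short "time slice",
    # save its state, and cycle on, so every user seems to have the machine alone.
    def time_share(jobs, slice_steps=1):
        queue = deque(jobs.items())              # (user, steps of work still to do)
        while queue:
            user, remaining = queue.popleft()    # restore this user's saved work
            done = min(slice_steps, remaining)
            print(f"running {user} for {done} step(s)")
            remaining -= done                    # save the updated state
            if remaining > 0:
                queue.append((user, remaining))  # back of the line until the next turn

    time_share({"alice": 3, "bob": 2, "carol": 1})

A real system switches many times a second and must also save registers, memory, and open files for each user, but the round-robin idea is the same.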

The Arrival of the Personal Computer

The development of better integrated circuits led ultimately to the placement of a complete central processing unit (CPU) onto a single chip. In the early 1970s the California-based Intel Corporation was the first to produce such a chip, dubbed the "microprocessor." A personal computer uses one microprocessor, along with other chips to control memory and peripheral devices. Mainframes and minicomputers can run many microprocessors in parallel.

The first commercially available personal computer was the Altair, announced in 1974. The customer received a kit requiring hours of difficult assembly. Programming the finished machine was just as complex. Still, for $397 a hobbyist could claim to have a working computer. William Henry "Bill" Gates III and Paul Allen formed Microsoft in 1975 to sell a usable BASIC (Beginner's All-purpose Symbolic Instruction Code) language for the Altair, making it easy to program.

In 1976 Steven P. Jobs and Stephen G. Wozniak founded Apple Computer, which the following year introduced a more successful product, the Apple II. Unlike the Altair, the Apple II included a keyboard and a monitor, came assembled, and could do useful work. At first Jobs and Wozniak assembled computers in a garage, but they soon became multimillionaires. The success of Apple proved the commercial viability of personal computing. Other companies, such as Commodore and Tandy, soon announced personal computers of their own.

In 1981 IBM stepped into the fray with its own personal computer, dubbed "the PC." IBM realized it was coming late to the market and needed to get a product out fast. To this end it set up an independent division in Boca Raton, Florida, which met its deadlines by using off-the-shelf parts and an operating system from Microsoft.

IBM chose a different strategy from Apple, which had a closed system: users had to buy parts from Apple, and opening the system box voided the warranty. (The system box is the heart of a personal computer, containing all the important parts except for the monitor, keyboard, printer, and external devices.) IBM adopted an open architecture, leaving users and equipment manufacturers free to open the system box and make modifications. This contributed to the PC's success. Although the first models were limited, they could always be upgraded. And everyone wanted the IBM label.

The company made one other fateful decision. It did not buy Microsoft's MS-DOS operating system outright but rented a license, paying a fee for each computer sold. IBM let Microsoft keep the rights to license the system to others, never foreseeing the extent to which other companies would capitalize on the PC's success by developing low-cost "clones." To build a clone, a manufacturer needed only to purchase the Intel processor, emulate the PC's low-level behavior, and lease Microsoft's operating system. After a few years, Microsoft and Intel became the true designers of the PC environment (or "platform"), deciding what features would go into the next version of MS-DOS.

During the 1980s two designs—the Apple Macintosh and the PC—drove out the competition. The PC's advantage, in addition to open architecture, was that thousands of programmers wrote software for it; soon it had a huge base of programs and users. Apple stayed competitive by introducing the Macintosh in 1984; it was the first affordable graphical user interface (GUI) system (after a flirtation with the more expensive Lisa, named for Jobs's daughter).

The concept of a GUI was developed at Xerox's Palo Alto Research Center in the 1970s, although Apple for a time claimed ownership (a claim rejected during Apple's lawsuit against Microsoft, which ended in 1994). A GUI replaced the use of cryptic command names with menus, icons, and a pointing device called a "mouse." Microsoft saw the potential of such systems. But while the Macintosh's graphical features were built into its Read-Only Memory (ROM), Microsoft had to produce a system loaded from disk, like ordinary software. It also had to fit this system into the more limited memory of a PC. In 1985 Microsoft released its GUI system, called Windows. At first it was clumsy compared with the Macintosh and was not widely accepted. But eventually a better interface and improvements in PC hardware made Windows a success. In 1995 Microsoft released Windows 95 with great fanfare. In the early 2000s approximately 90 percent of personal computers were PCs running Microsoft Windows.

Changes Wrought by the Internet

Just as personal computers had allowed Microsoft to displace IBM, many observers felt that a new technology might allow even newer companies to displace Microsoft. This technology was the Internet, originally sponsored by the Department of Defense in the late 1960s as the ARPANET. The purpose of this system was to enable military communication after a nuclear attack. In the early 1990s the World Wide Web was launched to share information not only between government agencies but also with universities, nonprofit organizations, and private companies (the last designated by the ".com" suffix), among others. As a "hypertext" system, the Web supported links functioning as automated cross-references. The Internet provided the infrastructure for the World Wide Web, as well as for sending and receiving electronic mail (E-mail) and for downloading files.
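
The idea of hypertext links as automated cross-references can be suggested by a tiny, hypothetical sketch in Python, in which each page simply lists the pages it links to and "following" a link is a lookup. The page names and text are invented for illustration.

    # A toy hypertext: each page carries its text and the names of the pages it
    # links to. Following a link means looking up the page it names.
    pages = {
        "home":    {"text": "Welcome.",          "links": ["history", "contact"]},
        "history": {"text": "Founded long ago.", "links": ["home"]},
        "contact": {"text": "Write to us.",      "links": ["home"]},
    }

    def follow(page, link_number):
        target = pages[page]["links"][link_number]
        return pages[target]["text"]

    print(follow("home", 0))   # prints the text of the "history" page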

Much information accumulated on the Web, but little of it was easily accessible to ordinary users. This changed in 1993, when University of Illinois student Marc Andreessen developed the Mosaic browser. The university retained ownership of Mosaic, but Andreessen formed Netscape Communications with Jim Clark in 1994 and produced an even better browser, called Navigator. (Netscape was originally named "Mosaic Communications" but changed its name when it was unable to acquire rights to the Mosaic browser.) Within three years Netscape grew to a size that it had taken Microsoft its first eleven years to attain. This success helped initiate the "dot com" boom of the 1990s. In 1998 America Online (AOL) acquired Netscape Communications.

The Internet opened up new areas for computing. Previously, millions of people purchased computers but used them mainly for word processing and possibly balancing their checkbooks. As they signed up for Internet service providers such as AOL and MSN (Microsoft Network), people found that features such as E-mail, news, and stock quotes made computers more useful than ever. They could also use the Web to do research and buy products.

Several companies, including Netscape, AOL, and Sun Microsystems, a maker of workstations and servers, saw in the Internet an opportunity to change the industry. Microsoft had defined the single-user PC environment, but operations on a single computer were no longer as important as information shared between computers. Sun Microsystems developed a new programming language, Java, to take advantage of this fact. Java used a system of universal codes understood by different computers (each with its own interpreter). Because the underlying machine was no longer important, went the theory, Java itself would become the defining architecture rather than Windows.
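
The idea behind Java, universal codes plus a local interpreter on each machine, can be suggested by a deliberately simplified sketch in Python. The three codes below are invented for illustration; real Java bytecode is far richer, but the principle is the same: any machine with an interpreter can run the same codes.

    # A toy set of "universal codes" and an interpreter for them. The same
    # program runs anywhere this interpreter exists, regardless of the hardware.
    def interpret(codes):
        stack = []
        for op, *args in codes:
            if op == "PUSH":
                stack.append(args[0])
            elif op == "ADD":
                stack.append(stack.pop() + stack.pop())
            elif op == "PRINT":
                print(stack.pop())

    program = [("PUSH", 2), ("PUSH", 3), ("ADD",), ("PRINT",)]
    interpret(program)   # prints 5 on any machine that has an interpreter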

Microsoft responded by embracing the Internet. Products such as Microsoft Word incorporated Web access into their design. More controversial was Microsoft's inclusion of its own Web browser (Internet Explorer) in Windows itself. Critics contended that this undercut Netscape's browser by essentially distributing Explorer for free. The charge contributed to the antitrust case against Microsoft, which alleged that Microsoft took advantage of the monopoly power it held as a result of Windows' success. In June 2000 Judge Thomas Penfield Jackson ordered that the company be split in two. An appellate panel overturned this penalty in summer 2001, citing bias the judge had revealed in published interviews.

Other Developments

In the mid-1960s Gordon Moore, later a cofounder of Intel, predicted that for a given size of silicon chip, the amount of computing power would double every twelve to eighteen months. By the early 2000s this law continued to hold. New possibilities included moving from solid-state transistors to optical computing, with photons rather than electrons as the bearers of information. Meanwhile, as Internet subscribers moved from standard modems to digital subscriber lines (DSL), access times for the Web decreased by a factor of more than a hundred. The trend was toward the general merging of television, radio, and computers.
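
A quick back-of-the-envelope calculation shows what such doubling implies. Using the slower end of the estimate (one doubling every eighteen months), the Python snippet below works out the growth over a decade.

    # Doubling every eighteen months compounds to roughly a hundredfold growth
    # in computing power over ten years.
    months = 10 * 12                    # one decade
    doublings = months / 18             # one doubling every eighteen months
    print(f"{2 ** doublings:.0f}x")     # prints 102x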

But technical advances brought new risks as well. As more people began to use electronic mail and the Web on a daily basis, the world became more vulnerable to computer viruses and worms: programs that make copies of themselves, whether by attaching to other programs or by traveling over the network, and so spread from computer to computer. Computer security increasingly became an issue for individuals, government, and corporations.

While the Internet changed computing at the low end, advances continued at the high end, where mainframes evolved into supercomputers. In 1964 Seymour Cray built the CDC 6600, which used parallel processing and remained the most powerful computer in the world for years. He went on to start Cray Research, specializing in ever-more-powerful computers with important uses in research, mathematics, and space exploration.

A landmark demonstration of supercomputer power occurred in 1996, involving IBM's Deep Blue, a descendant of the Deep Thought chess program (named for a godlike computer in Douglas Adams's novel The Hitchhiker's Guide to the Galaxy). Able to examine roughly 100 million positions a second, Deep Blue won a chess game against world champion Garry Kasparov. To some this was a demonstration that computers were finally "smarter" than the most intelligent humans. But the demonstration was incomplete; computers have always handled logical calculations better than people. Far more mysterious realms of human consciousness—emotion, creativity, and the ability to write good poetry—remain unconquered. For all their progress, computers remain the servants of the human race, not the masters.

BIBLIOGRAPHY

Carroll, Paul. Big Blues: The Unmaking of IBM. New York: Crown, 1993.

Deutschman, Allan. The Second Coming of Steve Jobs. New York: Broadway, 2000. A fascinating account of what happened to Jobs after leaving Apple, including his role in 3-D computer animation.

Downing, Douglas A., Michael A. Covington, and Melody Mauldin Covington. Dictionary of Computer and Internet Terms. 7th ed. New York: Barron's, 2000. A compact book with a wealth of explanations on all areas of computing.

Hodges, Andrew. Alan Turing: The Enigma. New York: Simon and Schuster, 1983. The story of Turing's remarkable life.

Ichbiah, Daniel, and Susan L. Knepper. The Making of Microsoft. Rocklin, Calif.: Prima, 1991. One of the first accounts of Microsoft's early years, it remains an interesting account of one of the world's most successful companies.

Kaye, Barbara K., and Norman J. Medoff. The World Wide Web: A Mass Communication Perspective. Mountain View, Calif.: Mayfield, 1998. A straightforward guide to basics of the Web and the Internet.

McCartney, Scott. ENIAC: The Triumphs and Tragedies of the World's First Computer. New York: Walker, 1999.

Segaller, Stephen. Nerds 2.0.1: A Brief History of the Internet. New York: TV Books, 1998. The history of the Internet from its beginnings through the start-ups of the 1990s.

Wallace, James, and Jim Erickson. Hard Drive: Bill Gates and the Making of the Microsoft Empire. New York: Wiley, 1992. Contains in-depth portraits of some of the leading players in the business.

Brian Overland

See also Internet; Microsoft; Silicon Valley.
