Microchip

Although semiconductors and microchips are essential components of modern computers, many people do not realize that computing machinery does not really need to be constructed from components normally associated with electronic equipment. In fact, some of the earliest computers were purely mechanical machines; they did not rely on electrical technology at all. For example, Charles Babbage's Analytical Engine, designed in 1834 at a time when the use of electricity was in its infancy, was a purely mechanical machine. Had Babbage actually been able to build it, his Analytical Engine would have been a bona fide computing machine.

Similarly, many of the early computers and calculators were mostly mechanical, using carefully constructed linkages, levers, and cogs. It is important to note that the technology used to implement computers does not define them. Instead, machines are termed computers if they are programmable, regardless of the form the programming takes. Therefore, once mechanical computers and calculators had proven themselves somewhat cumbersome and inefficient, designers looked toward the then newly emerging electro-technologies as a means for implementing computers and calculators.

Around the mid-twentieth century, the analog computer was becoming an increasingly popular tool for solving differential equations. Valve (vacuum-tube) devices such as triodes, used in analog amplification equipment, were being mass-produced for the radio and wireless sets that were consumer items of the day. They were also suitable building blocks for the implementation of analog computers. Yet, while analog computers were predecessors of modern digital computers, they bore little resemblance to them.

To explain the development of these technologies, it is helpful to review their history. Scientists and mathematicians have known since the eighteenth century that differential and integral calculus can be used to model problems in the physical sciences. While solutions to differential equations can be developed manually, the process tends to be tedious. Analog computers offered a way of automating the generation of solutions to differential and integral equations. Building blocks made from valves and triodes were constructed to perform specific operations that are common in the solution of such equations. Blocks that performed arithmetic operations, such as addition, subtraction, multiplication, and division, could be assembled, along with others that performed operations like integration, differentiation, and other forms of filtering. These building blocks were assembled and connected using temporary wiring connections; this wiring was, in effect, the programming of these computers.
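To make the building-block idea concrete, the sketch below imitates an analog computer patch in software: a summer block and two integrator blocks are wired together to solve the damped oscillator equation x'' + 2ζωx' + ω²x = 0. This is only an illustrative model under assumed parameter values; a real analog computer performed the integration continuously with voltages rather than in discrete time steps, and the function names here are invented for the example.

```python
# Illustrative software stand-in for an analog computer patch (assumed example):
# a summer block and two integrator blocks solve x'' + 2*zeta*omega*x' + omega**2*x = 0.

def summer(*inputs):
    """Summer block: outputs the sum of its inputs."""
    return sum(inputs)

def make_integrator(initial_value):
    """Integrator block: accumulates its input over time (Euler approximation)."""
    state = {"value": initial_value}
    def integrate(rate, dt):
        state["value"] += rate * dt
        return state["value"]
    return integrate

omega, zeta, dt = 1.0, 0.1, 0.001      # assumed natural frequency, damping, time step
velocity_block = make_integrator(0.0)  # integrates acceleration into velocity
position_block = make_integrator(1.0)  # integrates velocity into position

x, v = 1.0, 0.0
for _ in range(20000):                 # 20 seconds of simulated time
    accel = summer(-2 * zeta * omega * v, -(omega ** 2) * x)
    v = velocity_block(accel, dt)
    x = position_block(v, dt)

print(f"x after 20 s ≈ {x:.3f}")       # a decaying oscillation, as expected
```

The temporary wiring of a real machine corresponds here to the order in which the blocks' outputs are fed into one another.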

In the beginning, programming an analog computer was a rather labor-intensive activity, and the computers themselves consumed a relatively large amount of electrical power. But their speed of computation was phenomenal compared to mechanical computers. The valves and triodes that made these machines possible are still occasionally found in esoteric modern audio amplifier equipment, but they have largely been consigned to history. The cause of this was the invention of the semiconductor transistor in 1947.

Solid-state physicist William B. Shockley (1910–1989) described the operation of the semiconductor transistor in 1950, and its development foreshadowed a revolution in electronics. The fundamental physical difference between a conductor of electricity and an insulator is that conductors permit the free flow of electrons, and insulators do not. In other words, if a packet of electrons is dropped at one point onto a piece of conducting material, such as a metal, the electrons will almost instantaneously redistribute themselves throughout the volume of the metal sample, spreading out until their distribution is uniform. A piece of insulating material, such as polyvinyl chloride (PVC plastic), resists this redistribution. The PVC prevents the localized collection of electrons from spreading out; instead, the electrons remain contained in one area, making that region negatively charged.

Shockley and his contemporaries discovered that there was a certain class of materials that could sometimes act like conductors but, with a certain amount of manipulation, could be made to act as insulators. This property made them somewhat special: they could conduct or insulate under control, making them ideal as switching devices. These materials became known as semiconductors because of their position logically between conductors and insulators. Silicon and germanium were identified early as semiconductor materials.
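The practical significance of a material that can conduct or insulate on command is easiest to see in digital logic, where each semiconductor device is treated as a switch. The toy model below, added here for illustration and not part of the original article, composes idealized switches into a NAND gate; because NAND is functionally complete, controllable switches are in principle sufficient to build any digital circuit.

```python
# Toy illustration: a semiconductor device modeled as an ideal switch that
# either conducts (True) or insulates (False) depending on its control input.

def switch(control: bool) -> bool:
    """Conducts when the control input is asserted, insulates otherwise."""
    return control

def nand_gate(a: bool, b: bool) -> bool:
    """NAND built conceptually from two switches in series pulling the output
    low: the output is low only when both switches conduct."""
    pull_down_conducts = switch(a) and switch(b)
    return not pull_down_conducts

# Truth table of the switch-based NAND gate.
for a in (False, True):
    for b in (False, True):
        print(f"NAND({int(a)}, {int(b)}) = {int(nand_gate(a, b))}")
```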

The production process for the creation of a semiconductor device is a complex, multistage activity, but it essentially involves impregnating minuscule semiconductor elements with impurity atoms (a process known as doping) so as to influence their electrical behavior in useful ways. The elements are then bonded to conductors and encased in plastic or ceramic containers, ready for use. Since then, the word "silicon" in the context of electronics has been used synonymously with terms such as "silicon chip," "chip," and "microchip."

Silicon, germanium, and other semiconductor materials derived from metal oxides have been used ever since, along with metals such as gold, aluminium, and copper, to produce semiconductor integrated circuit devices of extraordinary complexity and performance. Their successful miniaturization has meant that a great deal of functionality can be synthesized on a relatively small device. Additionally, these devices consume much less electrical power and operate at vastly greater speeds than the older valve and triode devices. Extra benefits have resulted from the perfection of the manufacturing processes as well, which has in turn led to these devices becoming inexpensive to purchase and reliable in operation.

An entire industry of massive proportions has been supported by these developments, with its genesis in an area near San Francisco, California, which has since become known as Silicon Valley. Subsequently, other regions in Europe and Asia, notably Japan and South Korea, have also established credibility in the mass production of semiconductors.

For some time, theorists and visionaries have proposed that semiconductors might eventually be replaced in computers by devices that implement computing circuitry using optics or quantum physical concepts, but these approaches are yet to be proven beyond the research laboratory. Any replacement technology will need to possess very impressive credentials indeed if it is to be as operationally effective, economical, and efficient as devices implemented with semiconductors.

See also Microcomputers.

Stephen Murray

Bibliography

Hilton, Alice Mary. Logic, Computing Machines, and Automation. Washington, DC: Spartan Books, 1963.

Sedra, Adel S., and Kenneth C. Smith. Microelectronic Circuits, 4th ed. New York: Oxford University Press, 1997.

Wakerly, John F. Digital Design: Principles and Practices, 3rd ed. Upper Saddle River, NJ: Prentice Hall, 2000.

Young, E. Carol. The New Penguin Dictionary of Electronics. Middlesex, England: Penguin Books Ltd., 1979.

Microchip

Microchips, also termed integrated circuits or chips, are small, thin rectangles of a crystalline semiconductor, usually silicon, that have been inlaid and overlaid with microscopically patterned substances so as to produce transistors and other electronic components on their surfaces. It is the components on the chip, not the chip itself, that are "micro," or too small to see with the naked eye. The microchip has made it possible to miniaturize digital computers, communications circuits, controllers, and many other devices. Since 1971, whole computer CPUs (central processing units) have been placed on single microchips; these devices are termed microprocessors.

Manufacture of a microchip begins with the growing of a pure, single crystal of silicon or another semiconducting element. A semiconductor is a substance whose resistance to electrical current is between that of a conductive metal and that of an insulating material such as glass (silicon dioxide, SiO2). This large, single crystal is then sawed into thin, disc-shaped wafers 4–12 inches (10–30 cm) across and only 0.01–0.024 inches (0.025–0.06 cm) thick. One side of each wafer is polished to high precision, then processed to produce on it a number of identical microchips. These are later cut apart, placed in tiny protective boxes or packages, and connected electrically to the outside world by metal pins protruding from the packages.
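As a rough illustration of the economics of wafer processing, the calculation below estimates how many identical square dies fit on a circular wafer. Both the numbers and the simple area-ratio formula with an edge-loss correction are assumptions made for this example, not figures from the article.

```python
import math

def dies_per_wafer(wafer_diameter_mm: float, die_side_mm: float) -> int:
    """Rough estimate of usable square dies on a circular wafer:
    wafer area divided by die area, minus partial dies lost at the round edge."""
    die_area = die_side_mm ** 2
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area)
    return int(wafer_area / die_area - edge_loss)

# Example: a 300 mm (12 inch) wafer and a 10 mm x 10 mm die.
print(dies_per_wafer(300, 10))   # roughly 640 dies with these assumed sizes
```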

To produce a microchip requires industrial facilities that cost billions of dollars and must be retooled every few years as technology advances. The basics of the microchip fabrication process, however, remain the same: by bombarding the surface of the wafer with atoms of various elements, impurities or "dopants" can be introduced into its crystalline structure. These atoms have different electron-binding properties from the silicon atoms around them and so populate the crystal either with extra electrons or with holes, gaps that behave much like positively charged electrons. Holes and extra electrons confer specific electrical properties on the regions of the crystal where they reside. By arranging the doped regions containing holes or extra electrons and covering them with multiple, interleaved layers of SiO2, polycrystalline silicon (silicon composed of small, jumbled crystals), and metal strips that conduct current from one place to another, each microchip can be endowed with thousands or millions of microscopic devices. Such chips are termed integrated because the electronic components in them are integral parts of a single, solid object; this both decreases their size and increases their reliability.
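The following back-of-the-envelope calculation, added here for illustration, shows how small a dopant fraction is needed to dominate a silicon crystal's electrical behavior. It uses standard room-temperature textbook values and the mass-action relation n·p = ni²; the chosen donor concentration is an assumption for the example.

```python
# Illustrative doping arithmetic for silicon near room temperature.
# Typical textbook values (assumed for this example, not taken from the article):
SILICON_ATOMS_PER_CM3 = 5e22   # atoms of silicon per cubic centimeter
NI = 1e10                      # intrinsic carrier concentration ni (per cm^3)
DONORS_PER_CM3 = 1e16          # assumed donor (e.g., phosphorus) concentration

# With donor doping far above ni, each donor contributes about one free electron:
electrons = DONORS_PER_CM3
# The mass-action law n * p = ni**2 then gives the remaining hole concentration:
holes = NI ** 2 / electrons

print(f"Dopant fraction: about 1 atom in {SILICON_ATOMS_PER_CM3 / DONORS_PER_CM3:.0e}")
print(f"Free electrons: {electrons:.0e} per cm^3")
print(f"Holes: {holes:.0e} per cm^3 (electrons outnumber holes {electrons / holes:.0e} to 1)")
```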

The microchip was conceived independently in 1958 by U.S. engineers Jack Kilby (1923–2005) and Robert Noyce (1927–1990). In 1962, microchips were used in the guidance computer of the U.S. Minuteman missile (a nuclear-tipped intercontinental ballistic missile based in holes, or silos, in the American Midwest); the U.S. government also funded early microchip mass-production facilities as part of its Apollo program, which required lightweight digital computers. The Apollo command and lunar modules each had microchip-based computers with 32-kilobyte memories.

For some 40 years, the number of electronic components on an individual microchip has doubled every few years; this trend has been known as Moore's Law since 1965, when U.S. engineer Gordon Moore first described it. Engineers continually strive to fit more electronic components on each microchip; however, this is becoming steadily more difficult as device dimensions shrink toward the atomic scale, where quantum uncertainty renders traditional electronics unreliable. Microchip engineers predict that by about 2020, the exponential increases of the last few decades will cease.
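The trend is easy to state as a formula: if the component count doubles every T years, the count after t years is N(t) = N0 · 2^(t/T). The short projection below assumes a two-year doubling period and a starting count of a few thousand transistors (roughly the scale of the first microprocessors); both are illustrative assumptions rather than figures from the article.

```python
# Moore's Law expressed as a doubling formula (two-year period assumed here;
# the historical doubling period has varied between roughly one and three years).

def components(n0: float, years: float, doubling_period_years: float = 2.0) -> float:
    """Component count after `years`, starting from n0 and doubling every period."""
    return n0 * 2 ** (years / doubling_period_years)

# Project 40 years of doubling from about 2,300 transistors (circa 1971):
print(f"{components(2300, 40):.2e}")   # about 2.4e9, i.e., billions of transistors
```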

Since their advent, microchips have transformed much of human society. They permit the manufacture of small electronic devices containing many millions of components; they are essential to computers, missiles, smart bombs, satellites, communications devices, televisions, aircraft, spacecraft, and motor vehicles. Without microchips the personal computer, cell phone, calculator, Global Positioning System, and many other familiar technologies, both military and civil, would be impossible. As chip complexity increases and cost decreases thanks to improvements in manufacturing technique, new applications are continually being found.

See also Clipper chip; Computer hardware security; Computer keystroke recorder; Forensic science.

Resources

OTHER

Moore, Gordon. "No Exponential Is Forever . . . but We Can Delay 'Forever'." International Solid State Circuits Conference, February 10, 2003. <http://download.intel.com/research/silicon/Gordon_Moore_ISSCC_021003.pdf> (accessed October 23, 2006).

Larry Gilman


microchip


mi·cro·chip /ˈmīkrōˌchip/ • n. a tiny wafer of semiconducting material used to make an integrated circuit. • v. (-chipped, -chip·ping) [tr.] implant a microchip under the skin of (a domestic animal) as a means of identification.
