Physics: The Standard Model, String Theory, and Emerging Models of Fundamental Physics

Introduction

One of the crowning achievements of twentieth-century physics is the standard model of particle physics, an attempt to construct a complete description of all fundamental particles that exist in the universe and their interactions with one another. While the standard model represents a stunning success of the methods of modern physics and stands as a monument to the complex interplay between theory and experiment, it still leaves many questions about the nature of matter unanswered.

Historical Background and Scientific Foundations

One of the oldest goals of science has been to understand the fundamental structure of matter. This search began in the ancient world, when philosophers such as Democritus (460–370 BC), Epicurus (341–270 BC), and Lucretius (fl. first century BC) suggested that if you divide an object into smaller and smaller pieces, eventually you would arrive at an entity that could no longer be divided. These entities were called “atoms,” from the Greek atomos, meaning indivisible.

The question of whether or not atoms were real remained a philosophical one for the next 1,500 years. But in the eighteenth century the emerging fields of physics and chemistry began to provide concrete evidence for their existence. Physicists began to realize that properties of matter such as heat were the result of the motions of tiny particles. And chemists identified a number of substances that seemingly could not be broken down into simpler ones—the chemical elements. The periodic table of the elements, first formulated by Dmitry Ivanovich Mendeleyev (1834–1907) between 1868 and 1870, was basically a catalog of all of the atoms then known to exist in nature. In the early part of the twentieth century, this picture began to change as “indivisible” atoms were found to be composed of even smaller particles. The quest for the fundamental building blocks of matter was only beginning.

Parts of the Atom

The first fundamental particle to be identified as such was the electron. In 1897 British physicist Joseph John Thomson (1856–1940) was working with a recently invented device called a cathode ray tube—essentially a primitive version of the picture tubes used in early television sets. In studying cathode rays, Thomson found that they behaved like a beam of tiny negatively charged particles. These “corpuscles” of electrical charge, as he called them, were much lighter than any known atom, and it seemed they could be generated from any kind of material. This led scientists to conclude that each atom had within it some number of these tiny charged particles—the first hint that atoms were neither elemental nor indivisible, but composed of even smaller particles.

If an atom contained negative charges, it had to contain positive charge as well, since matter is generally electrically neutral. The earliest theory of atomic structure, held by Thomson and others, was the “plum pudding” model, which envisioned the atom as a uniform sphere of positive charge with tiny chunks of negative charge, the electrons, embedded within it. This picture, however, was shattered in 1911 by New Zealand-born British physicist Ernest Rutherford (1871–1937).

In Rutherford's experiment, thin pieces of metallic foil were bombarded with alpha particles from a radioactive source. If the atom were a soft, uniform, positively charged sphere, the positively charged alpha particles should all be only slightly deflected as they passed by or through the atoms. Instead, Rutherford and his collaborators, the German physicist Hans Geiger (1882–1945) and British-born New Zealand physicist Ernest Marsden (1889–1970), found that most of the alpha particles passed straight through the foil, while a very small number of them bounced almost straight back, as if they had encountered something solid. This led Rutherford to propose a new model of the atom in which the electrons moved about in a region of nearly empty space, while the atom's positive charge was entirely concentrated in a tiny central region called the nucleus.

By 1918 subsequent experiments allowed Rutherford to identify the fundamental unit of positive charge as the proton. The other important component of atomic nuclei, the neutron, was discovered in 1932 by British scientist Sir James Chadwick (1891–1974). These three particles—the proton, the neutron, and the electron—gave scientists a more complete and fundamental description of the properties of the elements in the periodic table. These 90 or so elements were no longer thought to be elemental at all. Instead, each type of atom represented some combination of protons, neutrons, and electrons.

It was clear that this model of the atom, a tiny nucleus packed with positively charged protons, demanded a new kind of physical explanation. If the only force at work in the atom was that of electrical attraction and repulsion, the nucleus should fly apart due to the repulsive force between the protons. The existence of the nucleus suggested that some force bound protons and neutrons together.

Into the Nucleus

Experiments in the 1930s and 1940s used the first primitive particle accelerators to probe the structure of the nucleus and discover the force that held protons and neutrons together. But rather than simplifying the structure of matter, these experiments soon revealed a subatomic universe that was surprising and complex.

The first new particle, the muon, was discovered in 1936. Initially thought to be a new type of nuclear particle involved in the force between protons and neutrons, it was later found to have none of the properties that would be expected of such a particle, prompting the Austro-Hungarian-born American theoretical physicist Isidor Isaac Rabi (1898–1988) to ask jokingly, “Who ordered that?” The pion, discovered in 1947, did explain many of the processes taking place inside the atomic nucleus. That same year, however, the lambda was found to be heavier than the proton and neutron—another surprise. The kaon was discovered in 1949, delta particles in 1952. By the middle of the 1960s, more than a hundred new particles had been discovered. Rather than simplifying our understanding of matter, these experiments revealed a vast and complicated array of previously unknown particles.

The Science: The Electron, Muon, and Tau

The muon, a particle with properties nearly identical to those of the electron, was discovered by American physicist Carl David Anderson (1905–1991) in 1936. With a mass more than 200 times that of the electron, the muon is unstable, existing only for a few millionths of a second on average before decaying into an electron and neutrinos. But in every other respect—charge, spin, interactions with other particles—it behaves just like an electron. In fact, scientists have even created short-lived “muonic atoms,” in which an electron is replaced by a muon.

Another member of this group was discovered in the mid-1970s. The tau has a mass roughly 3,500 times that of the electron. Its lifetime is also very short, typically less than a trillionth of a second. Along with the electron, the muon and tau make up a group of particles that physicists call leptons, from the Greek for small, thin, or delicate. Each lepton has an associated neutral partner, a particle with no electrical charge, called a neutrino.

The Neutrino

Neutrinos were first proposed as the solution to a nagging observation that plagued nuclear physicists in the early part of the twentieth century. Beginning in 1911, Austrian-born physicist Lise Meitner (1878–1968) and German chemist Otto Hahn (1879–1968) studied beta decay, a radioactive process in which a neutron in an atomic nucleus transforms into a proton plus an electron, and the electron is ejected from the nucleus as radiation. When the energies of the emitted electrons were measured, the process seemed to violate the law of conservation of energy: a small amount of energy appeared to be “lost.” This was deeply puzzling, since the idea that energy is conserved in all physical processes was—and still is—absolutely fundamental to our understanding of the universe.

A solution was proposed in 1930 by the Austrian-born physicist Wolfgang Pauli (1900–1958), who suggested that a small, light, neutral particle might also be produced in beta decay, and that this particle was escaping the experiment undetected, carrying with it some of the missing energy. Some years later, this hypothetical particle was named the neutrino by Italian-born American physicist Enrico Fermi (1901–1954), but more than two decades would pass before its existence was finally confirmed experimentally in 1956.

The neutrino interacts so weakly with ordinary matter that it typically passes through solid objects unaffected. Of the trillions of neutrinos that strike every square inch of Earth each second, most pass right through the entire planet and emerge from the ground on the other side; only about one out of every hundred billion is stopped by interacting with another particle along the way. Modern neutrino detectors, despite being far more sensitive than those used to discover the neutrino in the 1950s, still register only a handful of events each day. And yet the study of neutrinos is of great value to physicists and astronomers, giving them insight into the fundamental interactions of matter, as well as into processes that occur in the core of the sun and in distant supernova explosions. Today we know that there are three types of neutrinos. The original neutrino postulated by Pauli is known as the electron neutrino, but there are two other types or “flavors”: the muon neutrino, discovered in 1962, and the tau neutrino, whose existence was inferred after the discovery of the tau lepton in 1975 but which was not observed directly until 2000. Each is paired through its interactions with one of the charged leptons: the electron, muon, and tau.

One question about neutrinos that was unanswerable until very recently was whether they are massless, like the photon, or just very, very light. The main observable difference is that neutrinos with mass can change or “oscillate” from one flavor to another. Recent measurements indicate that some of the electron neutrinos produced in the sun's core transform into other flavors during the 93-million-mile trip from the sun to Earth. This shows that neutrinos do in fact have a small mass, although oscillation experiments are sensitive only to the differences between the neutrino masses, not to the masses themselves.
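In the simplified two-flavor picture, the probability that a neutrino created with one flavor is later detected as the other, after traveling a distance L with energy E, can be written (in units where ħ = c = 1) as

$$P(\nu_e \rightarrow \nu_\mu) \;=\; \sin^2(2\theta)\,\sin^2\!\left(\frac{\Delta m^2\,L}{4E}\right),$$

where θ is a mixing angle and Δm² is the difference of the squared masses, both of which must be determined by experiment. The oscillation disappears entirely if Δm² is zero, which is why observing flavor change implies that at least some neutrinos have mass.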

Antiparticles

In 1927 British theoretical physicist Paul Dirac (1902–1984) was working to combine the quantum mechanical description of the electron with Albert Einstein's (1879–1955) theory of relativity. The result, now known as the Dirac equation, seemed to allow the electron to occupy negative energy states in addition to its usual positive energy states. This was eventually interpreted to mean that another particle should exist, identical to the electron in most respects but with the opposite electrical charge. This antiparticle, the positron, was discovered in 1932 by Carl Anderson, who went on to discover the muon four years later.

We also know now that nearly all particles possess antiparticles. There are anti-muons and anti-taus, as well as antineutrinos and antiquarks. Some uncharged particles, however, such as the photon and the Z, are their own antiparticles.

The Rise of the Quark Model

Patterns in the masses, charges, and other properties of mesons (subatomic particles composed of a quark and an antiquark) and baryons (subatomic particles composed of three quarks, or of three antiquarks) led physicists to speculate that these particles had some sort of underlying structure, not unlike the way the periodic table explains the elements as assemblages of protons, neutrons, and electrons. In 1964 American physicist Murray Gell-Mann (1929–) proposed that the proton, neutron, and other heavy particles were made of smaller particles that he named quarks. Gell-Mann took the name from a line in James Joyce's novel Finnegans Wake: “Three quarks for Muster Mark. Sure he hasn't got much of a bark. And sure any he has it's all beside the mark.” Gell-Mann's model, which consisted of three quarks—the up, down, and strange—accounted for the number of known hadrons (subatomic particles composed of two or more quarks or antiquarks that take part in the strong interaction—both baryons and mesons are hadrons), as well as for the pattern of their masses, charges, and other properties.

At first only three quarks were needed to explain all of the known heavy particles. The quarks carry fractional electrical charges relative to the charge of the electron or proton, and they can be combined in groups of two or three to form all of the known baryons and mesons. For example, the up quark has a charge of +⅔, while the down quark has a charge of −⅓. A combination of two ups and a down gives a charge of +⅔ +⅔ −⅓ = +1, which corresponds to the proton. A combination of two downs and an up, on the other hand, gives an electrical charge of +⅔ −⅓ −⅓ = 0; two downs and an up therefore make a neutron. Lighter mesons such as the pion and kaon were explained as pairs consisting of a quark and an antiquark.
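This charge bookkeeping is simple enough to automate. The minimal sketch below (written in Python, using the standard quark-model charge assignments) adds up quark charges to reproduce the proton, the neutron, and the positively charged pion:

```python
from fractions import Fraction

# Electric charges of the original three quarks, in units of the proton charge.
QUARK_CHARGE = {
    "up": Fraction(2, 3),
    "down": Fraction(-1, 3),
    "strange": Fraction(-1, 3),
}

def hadron_charge(quarks, antiquarks=()):
    """Total electric charge of a hadron from its quark content.

    Each antiquark carries the opposite charge of the corresponding quark.
    """
    return (sum(QUARK_CHARGE[q] for q in quarks)
            - sum(QUARK_CHARGE[a] for a in antiquarks))

print(hadron_charge(["up", "up", "down"]))         # proton (uud):            1
print(hadron_charge(["up", "down", "down"]))       # neutron (udd):           0
print(hadron_charge(["up"], antiquarks=["down"]))  # positive pion (u, anti-d): 1
```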

The quark model successfully explained the properties of all hadrons known in the 1960s, although some theoretical considerations suggested there could be more than three quarks. In 1974 the discovery of the J/psi particle required the addition of a fourth quark to the model, the charm. This added some symmetry to the model, since there were now two quarks with a +⅔ charge (the up and the charm) and two with a −⅓ charge (the down and the strange). In 1977 a fifth quark was discovered—another −⅓ charged quark that was eventually named the bottom. Physicists were sure that a sixth would be found to complete the picture, a heavy quark with a +⅔ charge named the “top.” Nearly two decades of searching finally bore fruit when the top quark was produced in 1995 at the powerful Tevatron particle accelerator at Fermilab, near Chicago.

Standard Model Fermions

The six leptons (the electron, muon, and tau and their partner neutrinos), six quarks (up, down, strange, charm, bottom, top), and their antiparticles are the fundamental constituents of all matter in what has come to be called the Standard Model of particle physics. These particles share one common property: they have the same quantum mechanical “spin,” meaning they all carry a quantity of angular momentum equal to one-half of the reduced Planck constant. Physicists call this group of “spin ½” particles fermions. But fermions make up only half the particle-physics picture. The other group of particles in the standard model carries the forces between the particles of matter.

Exchange Forces and the Photon

The first force described by particle physicists was the electrical force, recognized since antiquity. The word “electron” comes from the Greek word for the substance amber, since rubbing amber with a cloth was known to generate “static electricity” that attracted other small, light objects. Electrical forces, like gravity, were long seen as mysterious invisible fields acting through empty space. While the strength and other properties of such forces were well understood, what was missing was an explanation of the mechanism by which they act.

Quantum theory helped shed light on the mechanism of electrical interactions between particles. Beginning in 1905, quantum ideas suggested that the interaction between charged particles and light happens in a discontinuous fashion, rather than through the smooth, continuous absorption of energy described by older theories of electromagnetism. Albert Einstein proposed that light itself comes in discrete chunks of energy—essentially “particles of light”—which later came to be called photons. The emission and absorption of photons by electrons and other charged particles eventually led to the idea that the mechanism behind electrical attraction and repulsion is an exchange of photons between the charged particles.

Two electrons exchanging photons with one another would move away from each other like two people on skateboards tossing a heavy object back and forth between them. These photons are called virtual photons because they exist only for a brief time before being absorbed by the other particle. The energy that creates virtual photons can be thought of as being borrowed briefly from the energy fluctuations allowed by the Heisenberg uncertainty principle, which in its energy–time form permits a system's energy to be uncertain by a small amount for a correspondingly short time. A complete theory of the interactions between charged particles via photon exchange was finalized in the 1940s and given the name quantum electrodynamics, or QED. The basic idea of particle exchange as the carrier of force between particles proved useful for explaining other natural forces as well.
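Written out, the energy–time uncertainty relation behind this “borrowing” is, roughly,

$$\Delta E\,\Delta t \;\gtrsim\; \frac{\hbar}{2},$$

so a virtual photon carrying an energy ΔE can exist only for a time of order ħ/ΔE before it must be reabsorbed.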

Quarks, Gluons, and the Strong Force

The idea that protons, neutrons, and other hadrons are made of more fundamental particles called quarks required an explanation of the force that binds quarks together. A theory of this interaction would have to explain some of the quarks' curious features, such as why they appear only in groups of two or three and why no isolated quark has ever been detected. The force between quarks is known as the strong force because it must be stronger than the electrical repulsion that pushes apart the like-charged quarks inside a proton or neutron.

Since a baryon such as the proton contains three quarks, the charge responsible for the strong force was deduced to come in three varieties, instead of the two (positive and negative) that produce the electrical force. The new strong charge was named “color charge,” and its three varieties were called red, green, and blue, because they combine to make a neutral combination, just as the three primary colors of light combine to make colorless white light.

The particle that carries the strong force between quarks is called the gluon. Unlike the photon, which transmits electrical forces between electrical charges but has no electrical charge of its own, the gluon has color charge. And, just like the quarks, it feels the strong force. This is one reason that the strong force is so strong, and the reason that no one has ever seen an isolated quark. The strong force actually gets stronger with distance, so that the quarks in a proton are trapped for good, a feature of the theory known as confinement.

The complete theory of quarks, gluons, and the strong force, known as quantum chromodynamics or QCD, is the result of the work of a number of physicists in the 1960s and 1970s. QCD turns out to be very challenging mathematically, and while it is difficult to produce exact results for many problems, the theory has successfully described the interactions of quarks.

The Weak Force

The existence of the neutrino, a neutral particle that interacts with matter only very weakly, suggested that there was some new force at work between neutrinos and the other particles of ordinary matter. Dubbed the weak force, it is felt by all particles of matter, regardless of their charge, but is associated most closely with neutrinos, since it is the only force they feel (other than gravity, which is weaker still).

The weak force is carried by a trio of particles called weak bosons: the W+, W−, and Z0. These force-carrying particles are among the most massive in the standard model, and their large mass is responsible for the weakness and the short range of the weak force. The Heisenberg uncertainty principle dictates that the more energy and mass a force-carrying virtual particle has, the shorter its life span and the shorter the distance it can travel between particles.
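A rough estimate makes this concrete: a virtual particle of mass m can travel no farther than about its Compton wavelength, ħ/(mc), before it must be reabsorbed. Using the standard values ħc ≈ 197 MeV·fm and a W mass of roughly 80 GeV gives

$$R \;\sim\; \frac{\hbar}{m_W c} \;=\; \frac{\hbar c}{m_W c^2} \;\approx\; \frac{197~\mathrm{MeV\cdot fm}}{80{,}000~\mathrm{MeV}} \;\approx\; 2.5\times 10^{-3}~\mathrm{fm} \;=\; 2.5\times 10^{-18}~\mathrm{m},$$

roughly a thousandth the size of a proton. This is why the weak force acts only over extremely short distances, while the massless photon gives the electromagnetic force an unlimited range.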

The full theory behind the weak force arose in the mid to late 1960s through the independent work of American theoretical physicist Sheldon Glashow (1932–), Pakistani theoretical physicist Abdus Salam (1926–1996), and American theoretical physicist Steven Weinberg (1933–). The complete theory, called electroweak theory, explained the action of the weak force and predicted the existence of the W and Z bosons, which were not detected experimentally until 1983 at the European Organization for Nuclear Research (CERN) in Switzerland.

Besides governing any interaction involving neutrinos, the weak force plays a role in any process in which a quark changes flavor from one type to another. For example, in beta decay (the process that led physicists to first postulate the existence of the neutrino) a neutron must change into a proton. This requires a down quark to change into an up quark, which can happen if the down quark emits a W−, changing its charge from −⅓ to +⅔. (The W− then decays into an electron and an antineutrino—specifically, an electron antineutrino.) This type of interaction occurs in any nuclear process in which neutrons turn into protons, or vice versa, including many types of radioactivity and the process of nuclear fusion.
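Written out as reactions, the standard summary of beta decay is

$$n \;\to\; p + e^- + \bar{\nu}_e \qquad \bigl(\text{quark level: } d \to u + W^-, \quad W^- \to e^- + \bar{\nu}_e\bigr),$$

and the electric charges (in units of the proton charge) balance at each step: −⅓ = +⅔ + (−1) for the quark transition, and −1 = −1 + 0 for the decay of the W−.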

Mass and the Higgs Boson

The W and Z bosons are unusual among force-carrying particles because they have mass, while the photon and gluon do not. One mechanism for generating particle masses is known as the Higgs mechanism, named after British physicist Peter Higgs (1929–), one of several theorists who showed in the 1960s how a mathematical idea called spontaneous symmetry breaking makes such a mechanism possible. The mechanism invokes a new sort of field, the Higgs field, which exists at every point in space. As particles move through space, their constant interactions with the Higgs field produce the sort of resistance to motion that we typically associate with the property called mass, much like the resistance we feel when wading through deep water in a swimming pool.

The particle responsible for this interaction is known as the Higgs boson, the last major missing piece of the standard model puzzle. Although it is an essential part of the theory, no experiment has yet detected the Higgs boson directly. Based on the range of possible Higgs masses predicted by the standard model, the next generation of more powerful particle accelerators, such as the Large Hadron Collider, is expected to reach the energies necessary to verify its existence.

Influences on Science and Society

The search for a theory of fundamental particles stretches from Thomson's discovery of the electron in 1897 to the discovery of the top quark in 1995 and the modern search for the Higgs boson—more than 100 years of experimental and theoretical work. Since 1950 more than twenty Nobel Prizes in physics have been awarded for work directly or indirectly related to the development of the modern standard model. During this time, experiments that could be conducted by one or two scientists with simple laboratory equipment have been replaced by “big science”—huge facilities with dozens of scientists and hundreds of technical and support personnel responsible for gathering each new piece of data. This is true throughout the sciences, but it is particularly so in experimental particle physics, where a new accelerator can cost billions of dollars. The standard model continues to drive inquiry in basic physics, not only because of its successes but because of its failures as well. There are many things that the standard model does not do, or at least does not do to the satisfaction of many particle physicists.

What About Gravity?

One thing the standard model does not do is include a description of the force of gravity. As far as we can tell there are only four forces in nature, and the standard model deals with three of them: the electromagnetic force, the strong force, and the weak force. (Gravity is handled by Einstein's general theory of relativity, which describes forces in a very different way.) The standard model is a quantum field theory, that is, it explains forces in terms of exchanges of quanta of fields, such as the photon and the gluon. But general relativity considers gravity to be a result not of particle exchanges, but of the curvature of space-time.

Many physicists view this situation—where three of the forces of nature are treated with one set of mathematical tools, while a completely different approach is used for the fourth—to be unsatisfactory. They believe that there is a deeper theory, a “unified” theory, that would provide a single description of all the forces of nature. So far, attempts to create a fully unified theory of gravity and quantum physics have proven unsuccessful.

Supersymmetry

Another unexplained feature of the standard model is that, while fermions can interact in ways that transform one type of matter particle into another, there are no interactions that transform fermions into bosons or vice versa. For a number of reasons, it is tempting for theoretical physicists to postulate a whole new set of particles and interactions that would bridge the gap between bosons and fermions, producing a theory that is fully symmetric in its treatment of the two classes of particles. This extension of the standard model, known as supersymmetry, postulates that every particle in the standard model has a supersymmetric partner on the other side of the fermion–boson divide. The spin-½ electron and neutrino have spin-0 partners called the selectron and the sneutrino. Likewise, the spin-0 superpartners of the quarks are known as squarks, and the photon and gluon have spin-½ partners as well—the photino and the gluino. None of these supersymmetric particles has ever been detected.

Supersymmetry may seem to add unnecessary complexity to the standard model by postulating additional particles for which there is no evidence. But the addition of the new supersymmetric particles actually solves a number of important theoretical and mathematical puzzles in the standard model, leading many physicists to trust that it is a correct theory that will be vindicated as the next generation of more powerful particle accelerators begins to produce the superpartner particles in the coming decades. Only time will tell.

Other Questions

Another weakness of the standard model is its inability to answer a number of basic questions. Why are there six leptons and six quarks? Why do the particles have the particular masses, charges, and other properties that they do? What determines the relative strengths of the various forces of nature? In addition, the number of free parameters in the standard model—numbers that are not predicted by the theory but must be determined experimentally and entered into the equations by hand—is quite high, roughly twenty in all. There is a sense among some physicists that a complete theory of fundamental particles should explain these properties from basic principles, without relying on parameters that just happen to have some particular value.

A related issue is that the number of fundamental particles seems to be quite large: six leptons, six quarks, and their antiparticles; the photon, the W and Z bosons, and eight types of gluons; and perhaps a supersymmetric partner for each of these. If the age-old quest to explain the structure of matter is an attempt to describe nature in terms of a very small number of basic constituents, the standard model does not seem to pass the test.

Unification and String Theory

The desire to unify our description of quantum forces with our understanding of space-time and gravity, and to simplify a picture containing dozens of particles, antiparticles, and superpartners into a theory with a very small number of entities, has led to a group of approaches known collectively as “string theory.” The idea behind string theory is that the universe actually has more than three spatial dimensions. Just as a piece of paper has a large width and height but a very small thickness, the universe might have the three large dimensions that we normally experience plus other “thicknesses”—dimensions curled up so small that they would become apparent only at the scale of elementary particles or below. In string theory, fundamental particles are treated not as tiny zero-dimensional points but as loops or “strings” that can wrap around and vibrate in these tiny hidden dimensions. The most commonly discussed string theories have six additional space dimensions, making the universe ten-dimensional—three large space dimensions, six small space dimensions, and one time dimension.

In string theory there is only one fundamental object: the string. But depending on how a string is situated or vibrating in the various dimensions, it can have different properties when viewed from our three-dimensional vantage point. No physicist has yet figured out how to construct a specific version of string theory that describes the particular particles we find in our universe. The number of possible ways to deal with the extra dimensions is enormous—perhaps as large as 10⁵⁰⁰ by some calculations. This astonishing number of possible string theories suggests that finding the “right” one for our universe could be very difficult. But the ease with which string theory accommodates standard model physics (including supersymmetry and gravity) makes it an intriguing avenue of research, one that has already given physicists important insights into the problem of unification.

Not all physicists believe that string theory is the right approach to unification. Many feel that its huge number of possible configurations means that we will never be able to find the correct set of string theory equations to describe the universe. Others doubt that string theory will ever be able to make concrete predictions that can be tested. If a theory never makes any testable predictions, then it is arguably not a scientific theory at all.

Modern Cultural Connections

The quest to probe the fundamental structure of the universe ever more deeply forces us to confront not only questions of how and when we know a theory is a good one, but also questions about our priorities as a society when it comes to investment in scientific research. Practical benefits can flow from almost any scientific endeavor (PET scans and MRIs, for example, are two valuable medical diagnostic tools that owe their existence to our understanding of matter at the subatomic level). But as physicists examine ever more esoteric phenomena at ever higher energies, the motivation becomes the desire for knowledge for its own sake rather than the quest for new technologies or applications. One can reasonably ask whether such discoveries are worth the price, especially given the rising cost of the experimental facilities needed to conduct research at the cutting edge of particle physics. In the United States, much of the funding for pure research of this kind comes from the federal government, so the question of which scientific research gets funded can become as much a political question as a scientific one.

An example of the conflict between pure science, government funding, and a changing political landscape can be found in the story of the Superconducting Super Collider (SSC), proposed in 1983 as a powerful next-generation particle accelerator designed to collide beams of protons at energies far exceeding those of the most powerful accelerators of the day. The goal of the SSC was to detect the Higgs boson and to explore possible phenomena beyond the standard model, such as supersymmetry. The project required a huge financial investment from the government; initial estimates suggested that it would cost more than $3 billion. Despite the cost, in 1987 the federal government approved construction of the SSC at a site in Texas.

Construction began in 1991, but the project was never completed. A number of factors, including a ballooning budget (the projected cost grew to more than $11 billion) and changing administration priorities, led Congress to cancel the SSC in 1993. To Congress, the SSC was a costly program that made an easy target for budget cuts; physicists considered its cancellation a tragic blow to basic research. Despite such setbacks, particle physics research continues, often with the help of international collaborations.

The newest particle accelerator, the Large Hadron Collider at CERN, is scheduled to begin operation in 2008, and physicists are already planning still more energetic machines, such as the proposed International Linear Collider, a global project that would not begin operating until well into the 2010s. These new machines will probe the standard model at higher energies and test its predictions with greater accuracy. Physicists hope to see something new (and perhaps even unexpected) at these energies, something that would indicate physical phenomena beyond the standard model—perhaps evidence of supersymmetry or extra dimensions, or even something that no one has thought of yet. But even if cracks do appear in the standard model, it will be impossible to deny its importance in shaping our view of the fundamental structure of matter, and of the universe.

Primary Source Connection

The following article was written by Rushworth M. Kidder, a senior columnist for the Christian Science Monitor until 1990, when he founded the Institute for Global Ethics. Kidder is the author of Reinventing the Future: Global Goals for the 21st Century. Founded in 1908, the Christian Science Monitor is an international newspaper based in Boston, Massachusetts. In his 1988 article, Kidder describes how string theory seized the imagination of the physics community.

STARTLING STRINGS

HOLD a tiny rubber band 10 inches from your eye. It appears to be what it is—a closed loop of rubber. Put it 10 yards from your eye. Now it appears to be a dot, a single point whose features are indiscernible. Put it 10 miles from your eye. You see nothing at all. That rubber band, many physicists now agree, is like a fundamental particle of matter. The naked eye can't see such particles at all. They're simply too small.

How can you see them at closer range? A particle accelerator helps. Like a giant microscope, it uses high energies to let you “see” tiny particles. Even that, however, doesn't provide the high resolution needed to see the outlines of the rubber band. It sees the rubber band, all right—but as a dot, not a loop.

But what if you had an accelerator that generated the kind of energy available at the big bang? It would be like seeing a particle up close through a mammoth microscope.

And would those dots still be dots?

No, says string theory, a new and promising way of looking at matter that in the last five years has seized the imagination of the physics community. Those dots, say string theorists, only look like dots. In fact, they're really like rubber bands—or, more accurately, loops of string. What's more, the 60 or so particles (depending on who's counting) in the “particle zoo” that makes up matter turn out to be, in this theory, different manifestations of a single loop-shaped object.

Then why do all these particles—quarks and leptons, anti-quarks and anti-leptons, and all the rest—seem to be so different? According to an extension of string theory called superstring theory—which incorporates a new symmetry called “supersymmetry”—the differences arise because these loops vibrate in different ways. Set a loop oscillating in a certain way, and it will appear to have certain properties. Seen from a distance, it might appear to be a “charmed” quark. With a different oscillation, the same string might seem to be a muon. If we could only see it up close, however, it would reveal itself to be just another dancing, jiggling, rolling pattern being played on a one-size-fits-all loop.

If that sounds strange, imagine a blind Martian coming to earth and hearing a one-stringed violin. He could be forgiven for thinking that the instrument had dozens of strings—one for each note he hears. In fact, all those notes come from just the one string. The secret? Each note results from a different vibration of the string.

So promising is the superstring theory, physicists say, that it appears capable of embracing a vast array of phenomena and explaining them as parts of a splendid, well-balanced whole. Even the four forces of nature—gravity, electromagnetic force, the strong force that binds protons and neutrons into the nucleus, and the weak force responsible for radioactive decay—may prove to be different manifestations of a single force. Physicists refer to supersymmetry, only half jokingly, as the TOE—the theory of everything.

But the TOE is not without its problems. “Right now,” says Rockefeller University physicist Heinz Pagels, “it's turned out to be a theory of nothing. By that I mean that, although it's extremely elegant both conceptually and mathematically, it has failed to make contact not only with experiment, but with the ordinary theories that we now know describe experiment.”

The problem, in part, is one of scale. These strings or loops, according to the theory, are just 10 to the -33 centimeters long. That means that it would take a million of them—multiplied by a billion, then by another billion, then by still another billion—to add up to a centimeter. Even the mammoth superconducting supercollider—the SSC, currently proposed by the United States Department of Energy—will develop energies that will “see” only to the range of 10 to the -15 centimeters.

Moreover, such strings can't exist in our ordinary three-dimensional universe. “It appears that the superstring theory implies that space-time is 10 dimensional,” says cosmologist Michael Turner. That means nine spatial dimensions and a 10th in time. In our everyday world, he notes, “we know of three spatial dimensions. That would say that we missed twice as many.”

Where are they? The best explanation, apparently, is that they're somehow “folded up” within the three-dimensional world, like leaves waiting to mature.

And then there's the mathematics, widely described as immensely challenging. So complex is it, in fact, that superstring theory has had to await the discoveries of highly elaborate mathematics for it to progress. Result: There has been a plethora of solutions to these theoretical problems, many of which claim to be good descriptions of the universe.

Despite the obstacles, however, physicist John Schwarz—who, with British physicist Michael Green, is one of the founders of string theory—is convinced that there are now only three possible string theories that could be right.

“Once you've said which of those theories is the right one,” he explains across the cluttered desk in his office at the California Institute of Technology, “you've given a completely unambiguous fundamental theory of nature. And now all you have to do to describe all of physics is to solve the equations.”

“That's, of course, the hard part,” he adds with a chuckle.

Rushworth M. Kidder

Kidder, Rushworth M. “Startling Strings.” Christian Science Monitor (June 16, 1988).

See Also Physics: Heisenberg Uncertainty Principle; Physics: QED Gauge Theory and Renormalization; Physics: Radioactivity; Physics: Special and General Relativity; Physics: The Quantum Hypothesis.

Bibliography

Books

Greene, Brian. The Elegant Universe: Superstrings, Hidden Dimensions, and the Quest for the Ultimate Theory. New York: W.W. Norton, 1999.

Oerter, Robert. The Theory of Almost Everything: The Standard Model, the Unsung Triumph of Modern Physics. New York: Pi Press, 2006.

Periodicals

Kidder, Rushworth M. “Startling Strings.” Christian Science Monitor (June 16, 1988).

Mervis, Jeffrey. “10 Years After the SSC: Scientists are Long Gone, but Bitter Memories Remain.” Science 302, no. 5642 (October 3, 2003): 40–41.

Web Sites

Lawrence Berkeley National Laboratory, Particle Data Group. “The Particle Adventure.” http://www.particleadventure.org (accessed November 10, 2007).

Nobel Foundation. “Solving the Mystery of the Missing Neutrinos.” http://nobelprize.org/nobel_prizes/physics/articles/bahcall/index.html (accessed November 10, 2007).

David L. Morgan
