Special Effects



PHYSICAL EFFECTS
OPTICAL EFFECTS
THEORETICAL CONSIDERATIONS
FURTHER READING

Special effects in cinema can be divided into physical and optical effects (in the industry often referred to as "effects" and "special effects," respectively), the former done in front of the camera, the latter after the negative has been exposed. Unfortunately, this neat distinction breaks down over some optical effects that are produced by double exposures of the film strip or rear projection during shooting, and increasingly in the use of physical ("practical") elements as resources in digital postproduction. Effects are most commonly associated with creating images of scenes, events, and characters that do not exist in the real world or that cannot be photographed, but they are also used for economic reasons. Cost is both a stimulus to and a major constraint on the use of special effects. Closely related to the cost factor are time constraints, and increasingly the physical capacity of computer processors. Many effects techniques have been designed expressly to increase the temporal and computing efficiency of complex sequences. Despite much recent press criticism of Hollywood blockbuster films, it is relatively rare for a film to be promoted exclusively for its special effects; nevertheless, many films depend on effects for their appeal.

The crucial qualities sought by most effects professionals are believability and innovation: the phrases "special effects" and "cutting edge" are difficult to disassociate, providing the profession with its greatest single challenge. At the same time, while taking pride in their craft, effects professionals commonly refer to the subordination of special effects to the narrative demands of the project, and are particularly sensitive to the possibilities of creating creatures, objects, and locations with distinctive personalities.

PHYSICAL EFFECTS

Physical effects are created by several types of professionals, the most celebrated of whom are stuntpeople. Such work demands both athleticism and skilled training, often in specialized areas that include work with cars, animals, or dangerous environments. These effects also require the work of specialized riggers and prop makers. The former provide tools such as wirework rigs for flying and falling, small ramps to make cars flip over, various types of safety harnesses and mats onto which stuntpeople can fall, and other similar devices. Prop makers are responsible for sugar-glass tableware, breakaway furniture, lightweight or rubber weapons, and similar items. Also involved in many stunts are specialists in the training and handling of animals ("wranglers"), pyrotechnics experts (responsible for fire effects), and set designers. Though many stunts are performed on location, others have to be staged on specially built sets, so that the design of the sets must accommodate the performance of the stunt while providing for the stuntperson's safety. The set designer must also create positions for cameras, since many stunts are "oncers," that is, actions that can be performed only once, either because a portion of the set has to be destroyed, or because the action is too risky to perform over and over. Thus multiple cameras are needed, each of which must have a good "eyeline" on the action while remaining hidden from the other cameras. Filming stunts often requires the use of camera speeds different from the standard twenty-four frames per second of normal cinematography. During the "Battle on the Ice" sequence in Alexander Nevsky (1938), for example, Edouard Tissé, Sergei Eisenstein's cameraman, reportedly shot at fourteen frames per second, speeding up the action when replayed, but elsewhere overcranked the cameras to slow down smaller actions, in order to give the impression that the lightweight swords were in fact heavy battle weapons. Wounds can be simulated using gelatine sacs of fake blood or pumps, by firing gelatine caps or blood-soaked swabs at stuntpeople, or by exploding small charges ("squibs") of blood and meat painted into or under the performers' clothes (an effect extensively used in The Wild Bunch, 1969).
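The arithmetic of variable camera speed is simple enough to sketch. The snippet below is purely illustrative (the frame rates other than the reported fourteen are invented for the example); it computes the factor by which action appears accelerated or slowed when footage shot at one speed is projected at the standard twenty-four frames per second.

```python
def apparent_speed(capture_fps: float, playback_fps: float = 24.0) -> float:
    """Factor by which on-screen action appears sped up (>1) or slowed (<1)."""
    return playback_fps / capture_fps

# Undercranking at the fourteen frames per second reported for the
# "Battle on the Ice" makes the action play back roughly 1.7 times faster.
print(apparent_speed(14.0))        # ~1.71
# Overcranking at, say, 48 frames per second halves the apparent speed,
# lending lightweight prop swords the weight of real battle weapons.
print(apparent_speed(48.0))        # 0.5
```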

An example of a scene that is impossible to shoot occurs in The Perfect Storm (2000): an unrepeatable meteorological event, far too dangerous for filming even if it could be repeated, and mostly occurring in pitch darkness. To re-create the ordeal of one trawler's crew, director Wolfgang Petersen's team built a large tank containing an industrial gimbal on which was mounted a full-scale replica of the ship. As the boat was tossed in the tank and crew members directed high-pressure hoses onto the actors, massive shipping containers converted into water tanks dumped thousands of gallons of water onto the set. Shot with Steadicam for close-ups and against bluescreen (large sheets of a specific shade of blue which, used as a reference tone, can be removed from the image and replaced with other footage, giving the impression that the live action takes place in remote or imagined settings) for wide shots, the scene was darkened in postproduction and illuminated by occasional flashes of artificial lightning. Sometimes the impossibility of a shot is not physical but political or financial, and many films either use roughly similar buildings to emulate famous sites across the world, or build them in whole or in part as sets.

Likewise, miniature sets fall in the domain of the effects department. Not only do miniatures require detailed modeling; they also create particular lighting demands. As every model train enthusiast knows, trees do not have the same structure as twigs. A specific challenge for miniatures is water, which acts very differently at smaller and larger scales, and is frequently mixed with milk and other liquids to break up the surface tension and to provide a better response to light. Miniature passes including water are often backed up with a pass for which the water is replaced with a reflective material like Mylar to provide reflections of the surroundings, and two or more passes are then combined in postproduction to create the final effect. Miniature fire likewise acts differently from large fires, and must be tricked: a common device is to use two light bulbs of a suitable color near each other, flicking them on and off to produce the play of firelight. Other sets, such as the Minas Morgul miniature for The Lord of the Rings: The Return of the King (2003), use fluorescent paints, and have to be shot not only with standard key and fill lights but also under ultraviolet illumination to bring out the unnatural colors. Miniature passes are frequently shot using smoke to obscure defects in the model or to allow for the compositing of the miniature shot with other elements. Smoke too acts differently at different scales, and specialized fumes are used for this purpose.

The talismanic use of miniature photography is most associated with the careers of Willis H. O'Brien (1886–1962) and Ray Harryhausen (b. 1920), especially the former's The Lost World (1925) and King Kong (1933), and the latter's Sinbad cycle. These films depend upon stop-motion cinematography, in which models built on articulated armatures, usually of light steel rods, are physically moved fractionally between frames in a miniature set. The result may look jerky to contemporary eyes but is widely cited as inspirational by a number of modern effects professionals. Particularly delightful is the constant ruffling of King Kong's fur as he is manhandled. During the 1970s and 1980s, advances in control systems made possible the rapid development of both human-operated puppets (for example, those from Jim Henson's [1936–1990] Creature Shop, which created the Muppets and many others), especially larger puppets requiring servo-motors to amplify the puppeteer's movements, and pure animatronic, robot-like puppets controlled remotely. A director who has used the technique widely is Steven Spielberg (b. 1946), whose Jaws (1975) is still frightening and whose productions developed convincing (and waterproof) dinosaur animatronics for The Lost World: Jurassic Park (1997). Consistency of lighting, relation to the rest of the miniature set, and the establishment of believable spatial relations between elements in the shot are critical factors in developing effective stop-motion sequences. In recent miniature cinematography, the key advances have included the development of methods for moving the miniature camera, and the evolution of the snorkel lens, which, as its name suggests, uses reflection to bring the lens far closer to the miniature. Mobile shots of miniatures, such as shots of fighting vessels in Master and Commander: The Far Side of the World (Peter Weir, 2003), were not possible in earlier effects films, where issues of parallax and the matching of camera moves between miniature and live-action shoots were far more difficult.

The problem of matching camera moves was considerably eased with the arrival of motion control. A computer linked to the camera records its movements on its mount (pans and tilts) as well as its lateral movement through the physical space in which it is dollied or tracked. The recording can then be used to drive a second pass through the same space, to replicate at a remote location a shot initiated in the studio, or to govern the movements of a virtual camera. Problems still arise with handheld or Steadicam shots and with the use of zoom lenses, since focal length is crucial for reproducing the shot. Conforming such difficult elements remains a highly skilled artisanal task.
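As a rough illustration, a motion-control pass can be thought of as a per-frame log of the camera's position and lens state that is later played back. The sketch below assumes a hypothetical `rig` object with `move_to`, `set_focal_length`, and `expose_frame` methods; the field names are invented for the example and do not correspond to any particular system.

```python
from dataclasses import dataclass

@dataclass
class MoCoSample:
    frame: int            # frame number in the recorded move
    pan: float            # degrees
    tilt: float           # degrees
    dolly: float          # metres along the track
    focal_length: float   # millimetres; zoom changes must be logged too

def replay_pass(samples: list[MoCoSample], rig) -> None:
    """Drive a second pass (or a virtual camera) through the recorded move."""
    for s in samples:
        rig.move_to(pan=s.pan, tilt=s.tilt, dolly=s.dolly)
        rig.set_focal_length(s.focal_length)
        rig.expose_frame(s.frame)
```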

Creating artificial space has evolved from the nineteenth-century melodramatic stage, where elaborate moving sets were used to create the illusion of larger vistas than the theater could hold. Developing from these theatrical traditions, Georges Méliès (1861–1938) first used hanging drops behind the action, and cut-out foregrounds and sidings to create the illusion of depth in his Star Film productions of the early 1900s. Drops, however, lacked the light responses that a less "stagey" taste demanded (although many directors retained a taste for them, notably Federico Fellini in such later films as E la nave va [And the Ship Sails On, 1983] and Il Casanova di Fellini [Fellini's Casanova, 1976]). In their stead was developed the technique of matte painting, traditionally executed on glass sheets that could be placed in relation to live action in such a way that the painting appeared to the camera as a natural continuation of the real space. One of the most celebrated examples of the technique was used to create Tara in Gone with the Wind (1939). Matte paintings are still used, often in the form of cycloramas ("cycs"), large semicircular drop curtains painted with pigments responsive to the lighting and film stock used for a shot, often composed of tiled photographs of real locations treated to add features, remove unwanted elements, or smooth over transitions from tile to tile. Cruder photocopied cycs are used to provide reflections of the virtual landscape onto real sets and actors.

In contemporary cinema, mattes are frequently replaced with blue- or greenscreen cycs against which the actors perform. Earlier versions of this technology filmed actors against an intensely lit blue or yellow backdrop through a beam-splitting prism inside the camera, which directed one stream of light to a strip that received only the blue or yellow light, while the other strip received everything but that color, thus creating a perfect traveling matte. The colors of contemporary cycs are likewise reference colors that can be simply subtracted from the photographic plate (the term used to describe an element used in compositing different versions of a scene into a single image) and replaced with a digital matte, itself frequently composed of tiled photographic elements.
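In digital compositing the same principle reduces to a few lines of arithmetic. The sketch below is a toy illustration only, assuming an RGB plate stored as floating-point values between 0 and 1; production keyers work in other color spaces and handle spill suppression and soft edges, none of which is attempted here.

```python
import numpy as np

def blue_screen_matte(plate: np.ndarray, threshold: float = 0.3) -> np.ndarray:
    """Crude blue-screen key: the matte is 0 wherever blue clearly dominates
    red and green (the backing), and 1 over the foreground action."""
    r, g, b = plate[..., 0], plate[..., 1], plate[..., 2]
    backing = (b - np.maximum(r, g)) > threshold
    return 1.0 - backing.astype(np.float32)
```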

RAY HARRYHAUSEN
b. Los Angeles, California, 29 June 1920

An American model animation and special effects expert, Ray Harryhausen provided the visual effects for many science fiction and fantasy films. Harryhausen's work was characterized by a combination of anatomical authenticity and creative fantasy, whether he was animating actual animals (the dinosaurs of One Million Years B.C., 1966) or imaginary beasts (the Venusian Ymir of 20 Million Miles to Earth, 1957).

As a young man Harryhausen was interested in sculpture and palaeontology, both of which would give his later animated work its distinctive verisimilitude. Harryhausen was impressed by Willis O'Brien's stop-motion animation for the original King Kong (1933), which inspired him to experiment with a variety of animation techniques himself. He showed his work, which he had produced in the family garage, to O'Brien, who hired Harryhausen as his assistant for Mighty Joe Young (1949), another ape movie. Harryhausen immediately established his careful working methods by sending a motion picture cameraman to a zoo to photograph one of the gorillas, using the footage to help give the film's animated ape an impressive array of individualized gestures.

After working briefly for George Pal's Puppetoon series, Harryhausen contributed some of the animated effects for Frank Capra's Why We Fight films of the 1940s. Independently, Harryhausen produced a series of short animated fairy tales (e.g., Little Red Riding Hood, 1949, and Hansel and Gretel, 1951), and he provided the special effects for one of the best dinosaur monster movies, The Beast from 20,000 Fathoms (1953), the first feature for which he was in charge of visual effects. The movie features a giant rhedosaurus, disturbed by atomic testing, who wreaks havoc on New York City. While working on Beast, a relatively low-budget movie, Harryhausen began exploring more resourceful ways of combining animated models with live backgrounds.

In Jason and the Argonauts (1963), Harryhausen refined the process he called Dynamation, which incorporates matte photography, sets built to scale, and the synchronization of animated and live-action photography. The film boasts some of Harryhausen's best work, including the justly famous sword fight between Jason and his men and seven skeletons, a sequence that alone took four and a half months to produce.

Harryhausen's work on It Came from Beneath the Sea (1955), about a giant octopus that attacks San Francisco, marked the beginning of a fruitful business relationship with producer Charles H. Schneer, which lasted for seventeen years and resulted in many films. Though some of Harryhausen's later work was more hurried and looks comparatively crude, it is important to keep in mind that he was working in the pre-digital era.

RECOMMENDED VIEWING

King Kong (1933), Mighty Joe Young (1949), The Beast from 20,000 Fathoms (1953), Earth vs. the Flying Saucers (1956), 20 Million Miles to Earth (1957), The Seventh Voyage of Sinbad (1958), Jason and the Argonauts (1963)

FURTHER READING

Harryhausen, Ray. Film Fantasy Scrapbook. New York: A. S. Barnes; London: Tantivy Press, 1972.

Harryhausen, Ray, and Tony Dalton. The Art of Ray Harryhausen. London: Aurum Press, 2005.

Barry Keith Grant

This technique is especially effective in cases where directors would previously have used rear projection to provide a moving matte effect. Rear projection demanded rigorous synchronization of the rear projector with the camera, and produced substantial difficulties in matching the focal length of the camera recording the actors with the depth of the scene rear-projected, an effect visible in a number of Alfred Hitchcock films, among them the driving scene in Notorious (1946). Typically, recent films use a combination of older and newer effects. The jet-bike chase through the forest in Star Wars: Episode VI—Return of the Jedi (1983), for example, uses a traveling matte, in which an undercranked Steadicam race through a forest location was matched with a rotoscoped matte into which the actors, filmed against bluescreen, could be slotted onto the same strip of film without recourse to digital editing. Rotoscoping refers to the traditional animation technique of tracing the outlines of photographed action, frame by frame, to produce moving silhouettes, a technique now partly automated in digital editing software.

Other physical effects used since the very early days of cinema include filters, such as day-for-night, which cut down the ambient daylight to emulate moonlight, and techniques such as dry-for-wet, especially useful when actors are required to produce emotional performances during underwater sequences. Scale effects such as the forced perspective used to produce the city square in Sunrise: A Song of Two Humans (F. W. Murnau, 1927) remain significant, as does the use of real lizards in Journey to the Center of the Earth (1959). Fantastic landscapes can be created by shooting small objects such as pebbles to make them appear the size of boulders, an effect used extensively in The Incredible Shrinking Man (1957), while its obverse appears in Attack of the Fifty-Foot Woman (1958).

Equally theatrical in origin is the use of makeup, prosthetics, and wigs, though again with the tendency to seek credibility rather than emotional effect. However, much of the more flamboyant use of these techniques—from Fredric March's transformation scene in Dr. Jekyll and Mr. Hyde (1931) to Jim Carrey's turn in Lemony Snicket's A Series of Unfortunate Events (2004), by way of John Carpenter's creature cycle of the 1980s and Tim Burton's Beetlejuice (1988)—tends to belong to the Grand Guignol tradition of the late nineteenth-century stage, a lineage that has inspired such masters of horror effects and makeup as Tom Savini (b. 1946) and Rob Bottin (b. 1959). Other stage-adapted techniques include the use of partial mirrors and reflections through glass plates held at a 45-degree angle to the camera, for such effects as ghosts or actors being consumed by flames that are actually several feet away but are reflected from the surface of the glass.

Other recent techniques deserving mention under the rubric of physical effects are bullet-time, motion capture, and digital scanning. Bullet-time, associated with effects supervisor John Gaeta's (b. 1965) work on The Matrix (1999), uses an array of still cameras timed by computer to construct an image of a single action viewed from multiple viewpoints in quick succession, giving the effect of freezing the action while a single virtual camera travels around it. Motion capture, which revives techniques developed by the chronophotographer Étienne-Jules Marey in the 1880s, studs a performer's body or face with tiny reflectors. Instead of recording visible light, motion capture uses infrared or other wavelengths to track the movement of these reflectors through three-dimensional space. The data so captured can then be applied to a digital double, or distorted to provide movements for an imaginary character. Digital scanning deploys a device rather like a barcode scanner on both objects and people to produce detailed three-dimensional geometry and surface maps, which can then be reworked in digital tools. Scans are used, for example, to scale up or down from models built by effects departments, rendering small sculptures as large edifices and vice versa. The technology is also used to capture actors' performances and map them onto digital doubles engaged in impossible stunts rendered in digital spaces. Such scans were used, for example, to provide key frames for the animation of Gollum's face in some sequences of The Lord of the Rings (2001–2003), and to map Ian McKellen's face onto a digitized Gandalf in the sequence showing his fall from the bridge of Khazad-dûm in the same film.
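A small, purely illustrative sketch of what happens to the captured data: once the reflectors have been triangulated into three-dimensional positions, simple geometry recovers the angles that drive the matching joints of a digital double. The marker names and figures below are invented for the example.

```python
import numpy as np

def joint_angle(parent: np.ndarray, joint: np.ndarray, child: np.ndarray) -> float:
    """Angle (in degrees) at `joint`, given three triangulated marker positions."""
    a, b = parent - joint, child - joint
    cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return float(np.degrees(np.arccos(np.clip(cos, -1.0, 1.0))))

# Hypothetical shoulder, elbow, and wrist markers (coordinates in metres)
# for one frame of capture; the solved angle would then be keyed onto the
# corresponding joint of the digital character.
shoulder = np.array([0.00, 1.40, 0.00])
elbow    = np.array([0.30, 1.10, 0.00])
wrist    = np.array([0.55, 1.30, 0.10])
print(round(joint_angle(shoulder, elbow, wrist), 1))
```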

Like motion control technology, motion capture ("mo-cap") and digital scanning have a relationship with physical reality that is as close as that of photography. Photography and cinematography rely on reflected light in the visible spectrum to construct two-dimensional images; mo-cap and scanning use nonvisible wavelengths to construct three-dimensional ones. Like the technique of taking molds from physical surfaces and applying them to miniatures and set construction, or using life-masks taken from performers as the basis for prosthetic makeup, the relationship with the surfaces of the sampled reality is in many instances more accurate than that gathered by traditional cinematography.

It is important to note that many effects are available for low-budget film production, and many low-budget films make innovative use of them. In AMY! (Laura Mulvey and Peter Wollen, 1979), what appears to be a full-sized chest of drawers reveals itself to be doll's house furniture. Double Indemnity Performed by the Japanese-American Toy Theatre of London is a 1970s video production enacted entirely by plastic wind-up toys. Spurts of fake blood are the hardy standby of many student films. Second-hand stores have provided props, costumes, and prosthetics for films as disparate as Peter Jackson's Bad Taste (1987) and The Lord of the Rings.

OPTICAL EFFECTS

Many optical effects are produced in camera, among them irising in and irising out (an effect that relies on literally manipulating the camera's iris, a technique already well established when Billy Bitzer (1872–1944) shot Broken Blossoms for Griffith in 1919), and blanking out areas of the field of view to emulate binoculars, telescopes, keyholes, gun sights, and similar shapes. Double exposure can be achieved in camera as well as in postproduction, by the simple expedient of rewinding the film and shooting over it again.

Many more effects relied on the optical printer, a device used to print from the master negative to the positive for editing. Dissolves from one shot to another and fades to black, for example, could be achieved by running two strips of negative through the printer simultaneously. Passing a matte (in this case a thin sheet of opaque material) across the interface of the two filmstrips, exposing first one area and then the area previously masked by the matte, produced wipes, whose variety is best displayed in RKO's Flying Down to Rio (1933). Different areas of the filmstrip can be printed with different images, a technique used extensively in the documentary Woodstock (1970). Crucially, optical printing can be used to match shots from disparate sources: for example, a landscape with characters reacting matched with a sky filled with billowing clouds (produced by spilling specially mixed pigments into a tank of translucent oil) for the arrival of the aliens in Independence Day (1996). The optical printer was also a crucial device in titling, where the lettering was filmed separately on a rostrum, and then printed over the photographic plate. Likewise, optical printing provided the base for such innovations as the mixture of cartoon with rotoscoped live action in Ub Iwerks's (1901–1971) early Alice animations, such as Alice the Toreador (1925), Alice Rattled by Rats (1925), and Alice the Whaler (1927).

Indeed, animation has remained a consistent source of effects within live action cinema, including such landmarks of animation as the city of the Krell in Forbidden Planet (1956) and the painterly effects of Waking Life (2001). The full integration of animation techniques into features had to wait, however, for the development of three-dimensional digital animation. Pioneer attempts like Disney's Tron (1982) and the Genesis effect in Star Trek II: The Wrath of Khan (1982) intimated what might be possible. The financial success of the first Star Wars (1977) indicated what could be achieved with almost exclusively analogue effects. By 1988, Industrial Light and Magic, the effects shop established by George Lucas, had pioneered the digital morph for Willow (released that year) and provided over a thousand shots for Robert Zemeckis's Who Framed Roger Rabbit (also released that year). Certain techniques have remained fairly constant, notably the use of key frame animation to establish the most important moments (frequently the beginning and end) of an animated gesture. Others were the fruit of laborious research, such as the problem of soft objects (which explains the preponderance of billiard balls in early digital animation) and z-buffering (getting objects to touch without penetrating each other on the z or depth axis of the image, as opposed to the x and y axes of two-dimensional images). Celebrated in early examples such as the watery pseudopod in James Cameron's (b. 1954) The Abyss (1989), digital animation swiftly reached for less self-conscious, more embedded functions in movies, achieving a notable success in Cameron's Titanic in 1997, where the distinctions between set, model, and animation were all but invisible to contemporary audiences.
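Key frame animation can be illustrated with a minimal sketch: the animator sets values only at the important frames, and the software fills in the frames between them. The example below uses straight-line interpolation and invented frame numbers; production software interpolates along editable spline curves with eased tangents.

```python
def interpolate_keys(keys: dict[int, float], frame: int) -> float:
    """Value at `frame`, linearly in-betweened from sparse key-frame values."""
    frames = sorted(keys)
    if frame <= frames[0]:
        return keys[frames[0]]
    if frame >= frames[-1]:
        return keys[frames[-1]]
    for lo, hi in zip(frames, frames[1:]):
        if lo <= frame <= hi:
            t = (frame - lo) / (hi - lo)
            return keys[lo] * (1 - t) + keys[hi] * t

# A gesture keyed only at its start, apex, and end (e.g. an arm angle in degrees):
keys = {0: 0.0, 12: 35.0, 24: 5.0}
print([round(interpolate_keys(keys, f), 1) for f in range(0, 25, 6)])
# [0.0, 17.5, 35.0, 20.0, 5.0]
```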

Early vector animation composed creations out of algebraic descriptions of curves. The popular NURBS (Non-Uniform Rational B-Splines) approach uses such curves to define sections of the surface of a creature rendered initially in wireframe view, a lattice of interconnecting lines. The areas bounded by these lines (polygons) can be programmed to relate to neighboring polygons, so that if one stretches, another may contract to make up for the move. More recently, animators have moved toward subdivision modeling, in which a crude figure is gradually refined by adding and subtracting polygons to provide detail. Industry wisdom has it that "reality begins at 1 million polygons," a mathematical response to the idea that a typical frame of 35mm film has approximately that many grains of silver compounds. Wireframe was for some years the basic view designers had during production, since the frames required relatively little processing time. Once the movements were approved, the frames would have surfaces applied to them. These may be generated digitally, typically by the process of ray-tracing, which allows for both surface color and texture and for different lighting conditions. Alternatively, they may have a "skin" applied, a surface texture derived from photography, as in the case of the digital Harrier jump jet in True Lies (1994). Especially for close-up shots, animators will frequently add bitmap effects, such as the paint effects available in Adobe Photoshop, to add extra detail or to provide digital "dirt." One attraction of three-dimensional modeling is that once built, a creature can be reused numerous times. A three-dimensional model is a dataset, and can be recycled not only in films but, for example, as a Computer-Aided Design and Manufacture (CADCAM) file, as was the case with the Buzz Lightyear character in Toy Story (1995), subsequently mass-produced as a toy.
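The polygon arithmetic behind that rule of thumb is simple. In the common case where each subdivision step splits every quad face into four, the face count grows by a factor of four per level, so even a modest base mesh approaches the million-polygon mark after a few levels; the base figure below is an invented example.

```python
def faces_after_subdivision(base_faces: int, levels: int) -> int:
    """Face count when each subdivision level splits every quad into four."""
    return base_faces * 4 ** levels

# A crude 4,000-face base mesh crosses one million faces at four levels:
print(faces_after_subdivision(4_000, 4))   # 1,024,000
```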

Individually handcrafted creatures may be too time-consuming, expensive, or processor-heavy for larger scale projects. Disney's The Lion King (1994) used a technique developed in scientific computing to analyze flocking behavior in order to animate the wildebeest stampede. Each wildebeest was given a small list of behaviors that it applied repeatedly, such as "run in the same direction as the others" and "always try to get to the inside of the group." Referred to as emergent behavior (complex behavior arising from the repeated application of a small rule set), this basic artificial life technology allowed the wildebeest effectively to animate themselves. Similar techniques have been used with larger numbers of "agents" with a broader range of behaviors in Disney's follow-up The Hunchback of Notre Dame (1996) for carnival crowds including a hundred or so different characters, each with a special attribute such as juggling, dancing, or carousing. Massive (Multiple Agent Simulation System in Virtual Environment), developed for The Lord of the Rings trilogy, extends these principles significantly. Massive uses motion-capture elements to provide its agents with vocabularies of up to two hundred movements. Each agent has collision detection, and each emits a signal allowing other agents to identify whether it is friend or foe. Controls allow animators to increase or diminish the amount of "aggression" at any moment, triggering a fight or a riot. Otherwise, the agents are allowed to direct their own actions, guided by tracking algorithms that direct them toward a particular goal, such as a pass through a valley. Agents are animated at one of three levels, according to their size relative to the camera, with maximum detailing (using subdivision modeling) applied only to those closest.

RICHARD TAYLOR
b. Richard Leslie Taylor, Cheshire, England, 8 February 1965

With Oscars® for special makeup effects (2002, 2004), costume (2003, 2004), and visual effects (2002), the critically and popularly successful Lord of the Rings trilogy is to date the high point of Richard Taylor's career. Perhaps the first films planned from the start for DVD release, the trilogy privileged the detailed attention to props, sets, and makeup that characterizes Taylor's work as the cofounder and artistic director of Weta, the firm that coordinated the production effects for the trilogy.

Founded as RT Effects in 1987 by Taylor and long-time partner Tania Rodger, the small model-making and effects studio was relaunched in partnership with director Peter Jackson and producer and editor Jamie Selkirk to service advertising, film, and television. Though closely associated with Jackson's early horror genre pieces, Taylor made his first major international impression with effects for the splatter epic Braindead (1992) and the television series Xena: Warrior Princess and Hercules: The Legendary Journeys, both produced by Sam Raimi and shot in New Zealand, where the company is based.

Taylor's work is characterized by the extensive use of physical elements, perhaps most unusually miniatures, notably Saruman's subterranean factory and the Gondorian capital of Minas Tirith in Lord of the Rings. Taylor honed his skills on caricature puppets for a TV satire show, on the lubricious monsters of Jackson's Meet the Feebles (1989), and on the incompetent ghosts of The Frighteners (1996). Something of that humor remains in the puppetry and animatronics featured in Taylor's work ever since, as the craft developed from the cartoonish work of Jim Henson's Creature Shop toward the photorealism of Weta's oliphaunts. For Lord of the Rings the animatronics were supplemented with digital scans of models, which could then be composited with three-dimensional elements, adding a new range of dynamics fusing sculptural with filmic movement. The hybrid physical-digital environment of twenty-first-century effects owes a significant debt to Taylor's innovations.

Art house credits for Once Were Warriors (1994) and Heavenly Creatures (1994) may have helped secure work on Master and Commander: The Far Side of the World (2003), to which Taylor contributed stunning model work on the eighteenth-century sailing ships, and on The Last Samurai (2003), for which Weta supplied the military weapons, which had become such a feature of The Lord of the Rings. The ability to build environments articulating an entire way of life extends to the meticulously detailed Edoras and Rivendell miniatures for The Lord of the Rings.

Jackson's King Kong (2005) and Andrew Adamson's Chronicles of Narnia (2005), both Weta projects, demonstrate that the invention continues, marked respectively by the legacies of Willis O'Brien and Ray Harryhausen. Now supplemented by Weta Digital, Weta Workshop's broadband satellite links connect the masters of the past to the globalized future of effects.

RECOMMENDED VIEWING

Meet the Feebles (1989), Braindead (1992), Heavenly Creatures (1994), The Lord of the Rings (2001–2003), Master and Commander: The Far Side of the World (2003), The Chronicles of Narnia: The Lion, the Witch and the Wardrobe (2005), King Kong (2005), The Legend of Zorro (2005)

FURTHER READING

Taylor, Richard. The Lord of the Rings: Creatures. Boston: Houghton Mifflin, 2002.

——. "Taylor-Made: At Long Last, an OnFilm Interview with Oscar®-winner Richard Taylor of Weta Workshop." OnFilm, December 2002: 15.

Sean Cubitt

Many Massive agents are entirely digital, but many, such as the animated horses attacking the "oliphaunts" in The Lord of the Rings: The Return of the King, also use photographic elements, while others, such as many of the "hero" (close-to-camera) "orcs," were given features derived from digital scans of performers in prosthetic makeup and full costume. To cut render times for sequences employing up to a hundred thousand agents, the Massive renderer begins with the agents closest to the camera, so that only those still visible behind them need to be rendered at all, although the occluded agents remain in some sense visible to the program, which tracks their movements while they are hidden from the virtual lens.
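Massive itself is proprietary, but the underlying idea described above (many agents repeatedly applying a small rule set, with crowd behavior emerging as a result) can be sketched in a few dozen lines. Everything in the snippet below, including the rules, weights, team names, and the aggression control, is an invented, simplified stand-in rather than the actual system.

```python
import random

class Agent:
    def __init__(self, x: float, y: float, team: str):
        self.x, self.y = x, y
        self.vx, self.vy = 0.0, 0.0
        self.team = team   # broadcast so other agents can tell friend from foe

def step(agents: list[Agent], goal: tuple[float, float], aggression: float) -> None:
    """One tick: each agent seeks the goal, stays with its group, avoids
    collisions, and (scaled by `aggression`) closes on nearby foes."""
    for a in agents:
        a.vx += 0.05 * (goal[0] - a.x)           # head for the goal, e.g. a valley pass
        a.vy += 0.05 * (goal[1] - a.y)
        for b in agents:
            if b is a:
                continue
            dx, dy = b.x - a.x, b.y - a.y
            d2 = dx * dx + dy * dy + 1e-6
            if d2 < 1.0:                          # collision detection: push apart
                a.vx -= dx / d2
                a.vy -= dy / d2
            elif d2 < 25.0 and b.team == a.team:  # cohesion: run with the group
                a.vx += 0.01 * dx
                a.vy += 0.01 * dy
            elif d2 < 25.0:                       # a foe: close in when aggressive
                a.vx += aggression * 0.02 * dx
                a.vy += aggression * 0.02 * dy
    for a in agents:
        a.x += a.vx
        a.y += a.vy

agents = [Agent(random.uniform(0, 10), random.uniform(0, 10),
                random.choice(["rider", "orc"])) for _ in range(200)]
for _ in range(100):                              # let the crowd animate itself
    step(agents, goal=(50.0, 0.0), aggression=0.5)
```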

Certain aspects of digital postproduction still pose challenges. The most familiar elements of the world, including eyes and skin, are considered the most difficult to render successfully. The most complex and successful experiments on skin tone include subsurface scattering of light, using complex three-dimensional models with not only skin but blood vessels, muscles, and bones. Major three-dimensional models are articulated on virtual skeletons, with virtual muscles, and with algorithms governing the sliding of skin over muscle and bone. Eyes, so deeply associated with emotion, must also be given great depth by the use of layers of animation, each of which responds differently to virtual light. Such effects must then be matched to the live-action lighting conditions, to movement within the lit environment, to the eyes' angle to the camera, and to anything in the environment that might be reflected in them. One solution to the problems posed by lesser challenges like water and fire is the use of sprites, practical elements, some filmed on location (like the stormy seas of Master and Commander: The Far Side of the World) and others created in studios, applied to three-dimensional geometry. In analogue days, such effects might be achieved in optical printers (a flamethrower shot was passed through the optical printer fifty times to provide the burning skies of Voyage to the Bottom of the Sea, 1961). Such sprites may then "track" other digital or photographic elements through software that instructs, for example, the sprite of a boat's wake to follow the boat, as in Troy (2004).

Other aspects are automations or more effective variants of traditional techniques. Editors have long been responsible for brushing out unwanted elements in a shot, either literally painting them out or using garbage mattes to hide them, replacing the matted area with a "beauty pass," a clean plate of the location without actors or equipment. These processes are now done digitally. The process of grading, during which photographic laboratories print the edited film to changing specifications in order to match the light and color responses, has also been overtaken by digital grading, a technology that allows far more than the emulation of filters for day-for-night shooting. Digital grading can be used to apply a color palette to an entire movie or sequence, and can be applied differentially to different areas of the image. This tool is useful not only for balancing exposures in scenes where one area is brightly lit and another in shadow, or for highlighting detail in an actor's face; it is also an essential tool for combining plates from disparate sources, especially when compositing may involve as many as fifty plates in a single frame.
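A minimal sketch of such a differential grade, assuming float RGB plates with values between 0 and 1: the lift/gamma/gain formulation and the use of a matte to restrict the grade to one area of the frame are simplified stand-ins for what grading suites actually offer.

```python
import numpy as np

def grade(plate: np.ndarray, lift: float, gamma: float, gain: float,
          mask: np.ndarray | None = None) -> np.ndarray:
    """Simple lift/gamma/gain grade, optionally restricted to a matted region.

    `mask` (same height and width as the plate, values in [0, 1]) lets the
    grade be applied differentially -- e.g. lifting only the shadowed half of
    a frame so it balances against a brightly lit area.
    """
    lifted = np.clip(plate + lift, 0.0, 1.0)
    graded = np.clip(lifted ** (1.0 / gamma) * gain, 0.0, 1.0)
    if mask is None:
        return graded
    m = mask[..., None]
    return m * graded + (1.0 - m) * plate
```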

Motion control files are extremely significant at this juncture, as is information on the types of lens used. Digital mattes, unlike their physical correlates, need to provide three-dimensional information if there is any camera movement, since a move would reveal another facet of the backdrop. A sky applied to a sequence may derive from "scenic" location shoots or be painted, but it must match the lighting on all the other plates—for example, casting cloud shadows or opening into brilliant sunshine on cue. The crisp detail of digital animations may need to have motion blur applied to make it more credible as the photographed object of a camera lens, and even such accidental artifacts as lens flares (an effect of sunlight bouncing inside the refracting elements of an actual camera lens) are often added digitally to give a greater sense of the presence of a real camera on the virtual or hybrid set. Pyrotechnic effects may be scaled to match the scene, in which case the effects of their light on the immediate environment need to be considered. Animatronics, water effects (sometimes shot at speeds over a hundred frames per second), puppets, digital effects, miniatures, and live action, many of them shot in multiple passes under different lights, must be blended together as seamlessly as possible. Excessive detailing may need to be toned down to produce a more coherent plane of vision, while providing for the effects of scale and of the interaction between layers. When major film projects may take two to three years to develop from storyboard (often a digital animatic) to release, the problem of infinite "tweakability" enters, not least since each change to the master edit requires a change to scoring and sound effects, whose synchronization with the image must be perfect to convince an audience of its authenticity. Not surprisingly, the digital storage for feature films is now measured in terabytes.

THEORETICAL CONSIDERATIONS

In classical film theory, only Béla Balázs (1884–1949) pronounced full enthusiasm for fantasy as a potential route for cinema. Though Sergei Eisenstein (1898–1948) was a consummate technician, and a great admirer of Disney, he, like André Bazin and Siegfried Kracauer, was committed to the idea of cinema as a realist vehicle in the purest sense. However, as Christian Metz once observed, "to some extent, all cinema is a special effect," and even classics of the realist canon, such as Citizen Kane (1941), have used the full range of physical and optical effects. More recent critics, following the lead of sociologist Jean Baudrillard, have complained (or rejoiced) that with special effects, cinema departs from the depiction of the world in order to produce a form of hyperreality whose social purpose is to point toward the unreality of the world of everyday experience.

Scholars reflecting on special effects, especially in the period since digital media made their biggest impact on movie production and postproduction, have derived much of their inspiration from phenomenology, following the lead of pioneer analyst Vivian Sobchack. In her work on science fiction film, Sobchack points especially to the construction of space—as a dimension as well as a place beyond the atmosphere—as a critical achievement. Michele Pierson provides a detailed account of what she considers the crucial transition from the "wonder years" of the late 1980s and early 1990s, when films like Terminator 2 (1991) foregrounded their effects wizardry, to the later 1990s, when effects became much more a tool for the production of familiar verisimilitude. Norman Klein and Angela Ndalianis emphasize the parallels between the postmodern culture of special effects and the baroque period of the Counter-Reformation, with its use of spectacle and illusion as a means to win propaganda wars. Taking a more culturally oriented approach, Scott Bukatman stresses the interplay between such themes as superhuman capabilities and cultural trends; like Klein and Ndalianis, Bukatman is interested in the connections between special effects cinema, theme parks, and such phenomena as Las Vegas casino hotels, some forms of sports, immersive technologies like virtual reality, and such related popular cultural forms as graphic novels and computer games. Urbanist and cultural commentator Paul Virilio includes special effects among the optical technologies with which he credits the acceleration of society, to the point of its disappearance. Vilém Flusser's preliminary work on digital photography, meanwhile, suggests that the apparatus of visual technologies exists to exhaust all possibilities, reducing humans to mere functionaries of that process. Between the annihilation of reality and the affirmation of the phenomena of human experience, the study of special effects, though nascent, is already beginning to alter our preconceptions of the nature and purpose of film.

SEE ALSO Animation; Camera; Cinematography; Crew; Makeup; Postmodernism; Production Process; Technology

FURTHER READING

Brosnan, John. Movie Magic: The Story of Special Effects in the Cinema. London: Abacus, 1977.

Bukatman, Scott. Matters of Gravity: Special Effects and Supermen in the 20th Century. Durham, NC: Duke University Press, 2003.

Klein, Norman M. The Vatican to Vegas: A History of Special Effects. New York: New Press, 2004.

Ndalianis, Angela. Neo-Baroque Aesthetics and Contemporary Entertainment. Cambridge, MA: MIT Press, 2004.

Pierson, Michele. Special Effects: Still in Search of Wonder. New York: Columbia University Press, 2002.

Pinteau, Pascal. Special Effects: An Oral History, Interviews with 37 Masters Spanning 100 Years. New York: Harry N. Abrams, 2004.

Sobchack, Vivian. Screening Space: The American Science Fiction Film. New York: Ungar, 1987.

——, ed. Meta-morphing: Visual Transformation and the Culture of Quick-Change. Minneapolis: University of Minnesota Press, 2000.

Sean Cubitt

Special Effects



Special effects (a term that typically refers to visual effects in live-action moving-image media, but that also covers audio effects and other possibilities) are the methods used to produce on-screen (or on-air) events and objects that are physically impossible or imaginary, or too expensive, too difficult, too time-consuming, or too dangerous to produce without artifice. The ethics of the related technologies are seldom discussed but are nevertheless significant.


Origins

Cinematic special effects grew out of trick photography and began with the trick film tradition popularized by early filmmakers such as Georges Méliès (1861–1938), a special effects pioneer who was the first to develop many in-camera techniques. Silent films used a variety of special effects techniques, particularly in the genres of science fiction and horror. Many new special effects technologies became possible after the invention of the optical printer in 1944, resulting in a new generation of science-fiction films in the 1950s that used the new techniques, as well as more realistic-looking effects in other films. Finally, the late 1980s and 1990s saw another advance in effects technology: the rise of digital special effects created in computers, which allowed live-action footage to be combined with anything that could be rendered in computer graphics.

Special effects are a large part of the film industry in the early twenty-first century, with a number of companies such as Industrial Light & Magic and Digital Domain specializing in the production of special effects. Special effects can be found in almost every genre of filmmaking, in both big-budget and low-budget films, as well as on television, most notably in advertising, where high budgets and short formats allow filmmakers to experiment with expensive new techniques.


Types of Special Effects

Special effects can be divided into four types: practical effects, in-camera effects, optical effects, and digital effects. Practical effects, also known as physical effects, are those that occur in front of the camera, such as rigged explosions, pyrotechnics, animatronic figures or puppetry, makeup effects, and so forth. Practical effects have the advantage of occurring on the set, directly within the scene and the action of the shot, and they require no postproduction processes.

In-camera effects are achieved through forms of trick photography and are made in the camera at the time of shooting. Such effects include shots taken at different camera speeds, shots using lens filters, and day-for-night shooting, all of which change the kind of image being recorded. Superimpositions and multiple-exposure matte shots require the film to be exposed, rewound, and exposed again, adding two or more images together onto the same piece of film before it is developed (this combining of imagery is also called compositing). Foreground miniatures, glass shots, and matte paintings make use of the monocular nature of the camera by falsifying perspective and making small objects close to the camera look as if they are part of larger objects farther away from the camera. Buildings can be extended and other large set pieces can thus be made inexpensively through the use of detailed models and paintings done with the correct perspective. Front projection and rear projection processes combine foreground sets and actors with backgrounds made from projected imagery (most typically as moving background imagery placed behind an actor driving a car).
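The perspective arithmetic behind foreground miniatures and glass shots can be illustrated with a simple, idealized pinhole-camera calculation; the sizes and distances below are invented for the example.

```python
def miniature_size(real_size_m: float, real_distance_m: float,
                   miniature_distance_m: float) -> float:
    """How big a foreground miniature must be to read as a full-size object
    placed much farther from the lens: apparent size scales with size/distance."""
    return real_size_m * miniature_distance_m / real_distance_m

# A 30 m building meant to sit 120 m from the camera can be faked by a
# model hung only 2 m from the lens -- provided both are kept in focus:
print(miniature_size(30.0, 120.0, 2.0))   # 0.5 (metres)
```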

Optical effects involve the use of an optical printer, a device invented by Linwood Dunn in 1944 that allows images on developed pieces of film to be rephotographed and composited together onto a single piece of film. An optical printer is basically a projector (or multiple projectors, in some cases) aligned with a camera in such a way that film frames can be rephotographed directly from another strip of film. Optical processes allow frame-by-frame control and greater precision in spatially positioning elements than is possible with in-camera compositing. Perhaps the most common form of optical compositing is the matte shot, wherein a foreground element is combined with a background, without the background visible through the foreground element (as would be the case with superimposition). To achieve this, keying processes are used for the production of foreground elements, and the most typical of these, blue screening and green screening, place the actor against a solid-color background, which is later optically removed from the shot. A holdout matte is made from the foreground element, which leaves a part of the rephotographed background plate unexposed, and the foreground element is later exposed onto the same plate, fitting into the unexposed area. Traveling mattes also make this technique possible for moving objects and moving camera shots.
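The digital equivalent of the holdout matte described above reduces to a couple of array operations. The sketch assumes float RGB plates and a single-channel matte that is 1 over the foreground element and 0 elsewhere; real compositors add edge blending, spill suppression, and premultiplication, all omitted here.

```python
import numpy as np

def matte_composite(fg: np.ndarray, bg: np.ndarray, matte: np.ndarray) -> np.ndarray:
    """Combine a keyed foreground with a background using a holdout matte."""
    a = matte[..., None]
    held_out_bg = (1.0 - a) * bg   # background with the foreground's area "unexposed"
    return held_out_bg + a * fg    # foreground dropped into that hole, so the
                                   # background never shows through it
```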

Digital effects are all done in a computer. Images are either shot with digital cameras or scanned from film into a computer, where they are edited and composited digitally. Digital effects avoid the generational loss (the loss that occurs when film images are rephotographed onto another piece of film) that happens during optical rephotography, and the computer makes matteing much easier and faster and gives the effects technician greater control over the image. Digital effects technology also allows computer-generated imagery to be combined with live-action footage, and allows images to be controlled down to individual pixels. Light, shadow, and color can all be adjusted, and digital grading can replace the color correction and matching previously done during the color timing of prints (the matching of colors from shot to shot in postproduction). Digital effects were experimented with during the 1980s and came into common use during the mid-1990s as techniques were developed and computer systems became powerful enough to make digital effects work affordable.

Some special effects (such as dinosaurs, space battles, monsters, and so forth) are obviously special effects no matter how well they are done, because the objects or events they portray clearly do not or no longer exist. Other effects, known as "invisible effects," are less noticeable because they portray objects and events (for example, background buildings, smoke, and building extensions) that do not call attention to themselves and that usually could have been done conventionally had the budget allowed it. Another type of invisible effect is one in which something is erased or removed from the image. One example is wire removal, in which the wires used to fly an actor or object are digitally erased during postproduction.


Ethics

The alteration and faking of photographs has existed as long as photography itself. Whether or not the use of special effects is ethical depends on the intentions and truth claims of the work in which they appear. By altering, combining, or fabricating images, special effects work reduces or removes the correspondence, or indexical linkage, that an image may have to its real-world referent. Thus, while special effects may be acceptable in films that are fictional or are clearly re-creations of events, one would not expect to find them in news or documentary footage that claims to be a record of actual events. Even when they are used in an entirely fictional film, how special effects are used can still greatly determine how a film is received by an audience. For example, Jackie Chan's earlier films, in which he actually does all his own stunts, are more impressive than his later films in which some of his stunts are the result of wire work and special effects. Likewise, while the digital crowd scenes in The Lord of the Rings: The Two Towers (2002) and Star Wars, Episode II: Attack of the Clones (2002) are impressive, one is still aware that they are special effects, unlike the massive crowd scenes in older movies such as Gandhi (1982) and the Russian version of War and Peace (1966–1967), which were all done using actual crowds. At the same time, not only are special effects used to create spectacle, but their creation itself has become a spectacle, as witnessed by "making of" featurettes often found among the DVD extras. For many, knowing how an effect was made can enhance the viewing experience rather than spoil the effect.

Advances in special effects have made fantastic ideas possible and allowed filmmakers to give them concrete expression. The fact that many effects in the early twenty-first century are photo-realistic and seamlessly integrated into live-action footage also means that a discerning viewer will need a certain degree of sophistication. Combined with unlikely storylines, the use of special effects, which makes unlikely or impossible events appear possible and plausible, may help to erode the ability of younger or unsophisticated viewers to distinguish between what is plausible and what is not. Despite the fact that the films in which special effects appear are often clearly fictional, seeing photo-realistic representations of what look like actual events can make an impression on some viewers, particularly in a culture in which so much of what people see of the world is mediated through film and television imagery. At the same time, because of magazines, books, and DVD extras detailing special effects techniques and technology, contemporary viewers often are more aware of how special effects are done and how they are incorporated into a film.

MARK J. P. WOLF

SEE ALSO Computer Ethics; Entertainment; Movies; Video Games.

BIBLIOGRAPHY

McAlister, Michael J. (1993). The Language of Visual Effects. Los Angeles: Lone Eagle Publishing. A dictionary of special effects terminology.

Rickitt, Richard. (2000). Special Effects: The History and Technique. New York: Watson-Guptill. A good overview of the history and techniques of special effects.

Vaz, Mark Cotta, and Craig Barron. (2002). The Invisible Art: The Legends of Movie Matte Painting. San Francisco: Chronicle Books. A book on a specific effect, matte paintings, which are the background images used in wide shots and combined with live action.

Wolf, Mark J. P. (2000). "Indexicality." In Abstracting Reality: Art, Communication, and Cognition in the Digital Age. Lanham, MD: University Press of America. A look at how digital technology has changed art, communication, and how people view their world and environment.

Special Effects


Special Effects ★★ 1985 (R)

A desperate movie director murders a young actress, then makes a movie about her death. Solid, creepy premise sinks in the mire of flawed execution; a good film about Hollywood ego trips and obsession is lurking inside the overdone script. 103m/C VHS, DVD. GB. Zoe Tamerlis, Eric Bogosian, Kevin J. O'Connor, Brad Rijn, Bill Oland, Richard Greene; D: Larry Cohen; W: Larry Cohen.

special effects


spe·cial ef·fects • pl. n. illusions created for movies and television by props, camerawork, computer graphics, etc.
