Extreme Influence – Thought Reform, High Control Groups, Interrogation and Recovered Memory Psychotherapy
That social influence is ever present and pervasive is one of the most fundamental observations that can be made about social life. While we are both guided and constrained by the influence factors present at every moment in our lives, we typically fail to appreciate fully how much of an effect cultural, community, and interpersonal influences have on our values, beliefs, and choices. This is partly because we rarely find ourselves in situations in which essentially all of the influence forces to which we are exposed are strongly organized and directed in support of a particular ideology, perception, or set of actions. Rather, we usually find ourselves in social situations in which the factors pressuring us to act in one way are offset somewhat by opposing pressures directing us toward different value positions, different understandings of the world, or different actions. The variability and diversity of the influences to which we are exposed allow us to maintain a sense that we are the source of our choices and actions, and contribute to our sense of personal autonomy, freedom of choice, and individuality.
The substantial power of community and interpersonal influence to shape perceptions and actions can most clearly be appreciated through the study of social environments in which influence factors are hyper-organized rather than relatively loosely knit together. It is through the study of influence environments that have been consciously designed to elicit conformity, promote radical change in a person's values and beliefs, and influence an individual's choices that it is possible to gauge the extent to which our sense of personal autonomy and individuality is rooted in social organization. Four examples of environments constructed to induce radical and dramatic shifts in major components of a person's values or perceptions, or to induce a shift on a single but nevertheless important decision, are briefly reviewed here: programs of thought reform or coercive persuasion (such as the programs of political reeducation attempted in China under Mao); programs of indoctrination carried out by ideologically focused high control groups (groups labeled as cults and organizations that market experiences alleged to psychologically transform the individual); modern police interrogation tactics that, through psychological means, elicit false confessions from innocent suspects; and quack psychotherapy treatments in which patients are led to mislabel hypnotic fantasies, dreams, and hunches as "recovered memories" of lengthy histories of sexual abuse that were entirely unknown to them prior to entering treatment.
REEDUCATION PROGRAMS
Coercive persuasion and thought reform are alternate names for programs of social influence capable of producing substantial behavior and attitude change through the use of coercive tactics, persuasion, and/or interpersonal and group-based influence manipulations (Schein 1961; Lifton 1961). Such programs have also been labeled "brainwashing" (Hunter 1951), a term more often used in the media than in the scientific literature. However identified, these programs are distinguishable from other elaborate attempts to influence behavior and attitudes, to socialize, and to accomplish social control. Their distinguishing features are their totalistic qualities (Lifton 1961), the types of influence procedures they employ, and the organization of these procedures into three distinctive subphases of the overall process (Schein 1961; Ofshe and Singer 1986). The key factors that distinguish coercive persuasion from other training and socialization schemes are (1) the reliance on intense interpersonal and psychological attack to destabilize an individual's sense of self and promote compliance, (2) the use of an organized peer group, (3) the application of interpersonal pressure to promote conformity, and (4) the manipulation of the totality of the person's social environment to stabilize behavior once modified.
Thought-reform programs can vary considerably in their construction. The first systems studied ranged from those in which confinement and physical assault were employed (Schein 1956; Lifton 1954; Lifton 1961, pp. 19–85) to applications that were carried out under nonconfined conditions, in which nonphysical coercion substituted for assault (Lifton 1961, pp. 242–273; Schein 1961, pp. 290–298). The individuals to whom these influence programs were applied were in some cases unwilling subjects (prisoner populations) and in other cases volunteers who sought to participate in what they believed might be a career-beneficial, educational experience (Lifton 1961, p. 248).
Significant differences existed between the social environments and the control mechanisms employed in the two types of programs initially studied. Their similarities, however, are of more importance in understanding their ability to influence behavior and beliefs than are their differences. They shared the utilization of coercive persuasion's key effective-influence mechanisms: a focused attack on the stability of a person's sense of self; reliance on peer group interaction; the development of interpersonal bonds between targets and their controllers and peers; and an ability to control communication among participants. Edgar Schein captured the essential similarity between the types of programs in his definition of the coercive-persuasion phenomenon. Schein noted that even for prisoners, what happened was a subjection to "unusually intense and prolonged persuasion" that they could not avoid; thus, "they were coerced into allowing themselves to be persuaded" (Schein 1961, p. 18).
Programs of both types (confined/assaultive and nonconfined/nonassaultive) cause a range of cognitive and behavioral responses. The reported cognitive responses vary from apparently rare instances classifiable as internalized belief change (enduring change), to a frequently observed transient alteration in beliefs that appears to be situationally adaptive, and, finally, to reactions of firm intellectual resistance and hostility (Lifton 1961, pp. 117–151, 399–415; Schein 1961, pp. 157–166).
The phrase "situationally adaptive belief change" refers to attitude change that is not stable and is environment dependent. This type of response to the influence pressures of coercive-persuasion programs is perhaps the most surprising of the responses that have been observed. The combination of psychological assault on the self, interpersonal pressure, and the social organization of the environment creates a situation that can be coped with only by adapting and acting so as to present oneself to others in terms of the ideology supported in the environment (see below for discussion). Eliciting the desired verbal and interactive behavior sets up conditions likely to stimulate the development of attitudes that are consistent with, and that function to rationalize, the new behavior in which the individual is engaging. Models of attitude change, such as cognitive dissonance theory (Festinger 1957) or self-perception theory (Bem 1972), explain the tendency for consistent attitudes to develop as a consequence of behavior.
The surprising aspect of the situationally adaptive response is that the attitudes that develop are unstable. They tend to change dramatically once the person is removed from an environment that has totalistic properties and is organized to support the adaptive attitudes. Once removed from such an environment, the person is able to interact with others who permit and encourage the expression of criticisms and doubts, which were previously stifled because of the normative rules of the reform environment (Schein 1961, p. 163; Lifton 1961, pp. 87–116, 399–415; Ofshe and Singer 1986). This pattern of change, first in one direction and then the other, dramatically highlights the profound importance of social support in the explanation of attitude change and stability. This relationship has for decades been one of the principal interests in the field of social psychology.
Statements supportive of the proffered ideology that indicate adaptive attitude change during the period of the target's involvement in the reform environment, and immediately following separation, should not be taken as mere playacting in reaction to necessity. Targets tend to become genuinely involved in the interaction. The reform experience focuses on genuine vulnerabilities as the method for undermining self-concept: manipulating genuine feelings of guilt about past conduct; inducing the target to make public denunciations of his or her prior life as being unworthy; and carrying this forward through interaction with peers for whom the target develops strong bonds. Involvement developed in these ways prevents the target from maintaining either psychological distance or emotional independence from the experience.
The reaction pattern of persons who display adaptive attitude-change responses is not one of an immediate and easy rejection of the proffered ideology. This response would be expected if they had been faking their reactions as a conscious strategy to defend against the pressures to which they were exposed. Rather, they appear to be conflicted about the sentiments they developed and their reevaluation of these sentiments. This response has been observed in persons reformed under both confined/assaultive and nonconfined/nonassaultive conditions (Schein 1961, pp. 163–165; Lifton 1961, pp. 86–116, 400–401).
Self-concept and belief-related attitude change in response to closely controlled social environments has been observed in other organizational settings that, like reform programs, can be classified as total institutions (Goffman 1957). Thought-reform reactions also appear to be related to, but are far more extreme than, responses to the typically less-identity-assaultive and less-totalistic socialization programs carried out by organizations with central commitments to specifiable ideologies and which undertake the training of social roles (e.g., in military academies and religious-indoctrination settings; Dornbusch 1955; Hulme 1956).
The relatively rare instances in which belief changes are internalized and endure have been analyzed as attributable to the degree to which the acquired belief system and imposed peer relations function to fully resolve the identity crisis that is routinely precipitated during the first phase of the reform process (Schein 1961, p. 164; Lifton 1961, pp. 131–132, 400). Whatever the explanation for why some persons internalize the proffered ideology in response to the reform procedures, this extreme reaction should be recognized as both atypical and probably attributable to an interaction between long-standing personality traits and the mechanisms of influence utilized during the reform process.
Much of the attention to reform programs was stimulated because it was suspected that a predictable and highly effective method for profoundly changing beliefs had been designed, implemented, and was in operation. These suspicions are not supported by fact. Programs identified as thought reforming are not very effective at actually changing people's beliefs in any fashion that endures apart from an elaborate supporting social context. Evaluated only on the criterion of their ability to genuinely change beliefs, the programs have to be judged abject failures and massive wastes of effort.
The programs are, however, impressive in their ability to prepare targets for integration into and long-term participation in the organizations that operate them. Rather than assuming that individual belief change is the major goal of these programs, it is perhaps more productive to view the programs as elaborate role-training regimes. That is, as resocialization programs in which targets are being prepared to conduct themselves in a fashion appropriate for the social roles they are expected to occupy following conclusion of the training process.
If identified as training programs, it is clear that the goals of such programs are to reshape behavior and that they are organized around issues of social control important to the organizations that operate them. Their objectives appear to be to train the target to present self, values, aspirations, and past history in a style appropriate to the ideology of the controlling organization; to reason in terms of the ideology; and to accept direction from those in authority with minimal apparent resistance. Belief changes that follow from successfully coercing or inducing the person to behave in the prescribed manner can be thought of as by-products of the training experience. As attitude-change models would predict, they arise "naturally" as a result of efforts to reshape behavior (Festinger 1957; Bem 1972).
The tactical dimension most clearly distinguishing reform processes from other sorts of training programs is the reliance on psychological coercion: procedures that generate pressure to comply as a means of escaping a punishing experience (e.g., public humiliation, sleep deprivation, guilt manipulation, etc.). Coercion differs from other influencing factors also present in thought reform, such as content-based persuasive attempts (e.g., presentation of new information, reference to authorities, etc.) or reliance on influence variables operative in all interaction (status relations, demeanor, normal assertiveness differentials, etc.). Coercion is principally utilized to gain behavioral compliance at key points and to ensure participation in activities likely to have influencing effects; that is, to engage the person in the role-training activities and in procedures likely to lead to strong emotional responses, to cognitive confusion, or to attributions to self as the source of beliefs promoted during the process.
Robert Lifton labeled the extraordinarily high degree of social control characteristic of organizations that operate reform programs as their totalistic quality (Lifton 1961). This concept refers to the mobilization of the entirety of the person's social, and often physical, environment in support of the manipulative effort. Lifton identified eight themes or properties of reform environments that contribute to their totalistic quality: (1) control of communication, (2) emotional and behavioral manipulation, (3) demands for absolute conformity to behavior prescriptions derived from the ideology, (4) obsessive demands for confession, (5) agreement that the ideology is faultless, (6) manipulation of language in which clichés substitute for analytic thought, (7) reinterpretation of human experience and emotion in terms of doctrine, and (8) classification of those not sharing the ideology as inferior and not worthy of respect (Lifton 1961, pp. 419–437, 1987).
Schein's analysis of the behavioral sequence underlying coercive persuasion separated the process into three subphases: unfreezing, change, and refreezing (Schein 1961, pp. 111–139). The phases differ in their principal goals and in their admixtures of persuasive, influencing, and coercive tactics. Although others have described the process differently, their analyses are not inconsistent with Schein's three-phase breakdown (Lifton 1961; Farber, Harlow, and West 1956; Meerloo 1956; Sargant 1957; Ofshe and Singer 1986). Although Schein's terminology is adopted here, the descriptions of phase activities have been broadened to reflect later research.
Unfreezing is the first step in eliciting behavior and developing a belief system that facilitates the long-term management of a person. It consists of attempting to undercut a person's psychological basis for resisting demands for behavioral compliance to the routines and rituals of the reform program. The goals of unfreezing are to destabilize a person's sense of identity (i.e., to precipitate an identity crisis), to diminish confidence in prior social judgments, and to foster a sense of powerlessness, if not hopelessness. Successful destabilization induces a negative shift in global self-evaluations and increases uncertainty about one's values and position in society. It thereby reduces resistance to the new demands for compliance while increasing suggestibility.
Destabilization of identity is accomplished by bringing into play varying sets of manipulative techniques. The first programs to be studied utilized techniques such as repeatedly demonstrating the person's inability to control his or her own fate, the use of degradation ceremonies, attempts to induce reevaluation of the adequacy and/or propriety of prior conduct, and techniques designed to encourage the reemergence of latent feelings of guilt and emotional turmoil (Hinkle and Wolff 1956; Lifton 1954, 1961; Schein 1956, 1961; Schein, Cooley, and Singer 1960). Contemporary programs have been observed to utilize far more psychologically sophisticated procedures to accomplish destabilization. These techniques are often adapted from the traditions of psychiatry, psychotherapy, hypnotherapy, and the human-potential movement, as well as from religious practice (Ofshe and Singer 1986; Lifton 1987; Singer and Lalich 1995).
The change phase allows the individual an opportunity to escape punishing destabilization procedures by demonstrating that he or she has learned the proffered ideology, can demonstrate an ability to interpret reality in its terms, and is willing to participate in competition with peers to demonstrate zeal through displays of commitment. In addition to study and/or formal instruction, the techniques used to facilitate learning and the skill basis that can lead to opinion change include scheduling events that have predictable influencing consequences, rewarding certain conduct, and manipulating emotions to create punishing experiences. Some of the practices designed to promote influence might include requiring the target to assume responsibility for the progress of less-advanced "students," to become the responsibility of those further along in the program, to assume the role of a teacher of the ideology, or to develop ever more refined and detailed confession statements that recast the person's former life in terms of the required ideological position. Group structure is often manipulated by making rewards or punishments for an entire peer group contingent on the performance of the weakest person, requiring the group to utilize a vocabulary appropriate to the ideology, making status and privilege changes commensurate with behavioral compliance, subjecting the target to strong criticism and humiliation from peers for lack of progress, and peer monitoring for expressions of reservations or dissent. If progress is unsatisfactory, the individual can again be subjected to the punishing destabilization procedures used during unfreezing to undermine identity, to humiliate, and to provoke feelings of shame and guilt.
Refreezing denotes an attempt to promote and reinforce behavior acceptable to the controlling organization. Satisfactory performance is rewarded with social approval, status gains, and small privileges. Part of the social structure of the environment is the norm of interpreting the target's display of the desired conduct as demonstrating the person's progress in understanding the errors of his or her former life. The combination of reinforcing approved behavior and interpreting its symbolic meaning as demonstrating the emergence of a new individual fosters the development of an environment-specific, supposedly reborn social identity. The person is encouraged to claim this identity and is rewarded for doing so.
Lengthy participation in an appropriately constructed and managed environment fosters peer relations, an interaction history, and other behavior consistent with a public identity that incorporates approved values and opinions. Promoting the development of an interaction history in which persons engage in cooperative activity with peers that is not blatantly coerced and in which they are encouraged but not forced to make verbal claims to "truly understanding the ideology and having been transformed," will tend to lead them to conclude that they hold beliefs consistent with their actions (i.e., to make attributions to self as the source of their behaviors). These reinforcement procedures can result in a significant degree of cognitive confusion and an alteration in what the person takes to be his or her beliefs and attitudes while involved in the controlled environment (Bem 1972; Ofshe et al. 1974).
Continuous use of refreezing procedures can sustain the expression of what appears to be significant attitude change for long periods of time. Maintaining compliance with a requirement that the person display behavior signifying unreserved acceptance of an imposed ideology and gaining other forms of long-term behavioral control requires continuous effort. The person must be carefully managed, monitored, and manipulated through peer pressure, the threat or use of punishment (material, social, and emotional) and through the normative rules of the community (e.g., expectations prohibiting careers independent of the organization, prohibiting formation of independent nuclear families, prohibiting accumulation of significant personal economic resources, etc.) (Whyte 1976; Ofshe 1980; Ofshe and Singer 1986).
The rate at which a once-attained level of attitude change deteriorates depends on the type of social support the person receives over time (Schein 1961, pp. 158–166; Lifton 1961, pp. 399–415). In keeping with the refreezing metaphor, even when the reform process is to some degree successful at shaping behavior and attitudes, the new shape tends to be maintained only as long as the temperature is appropriately controlled.
One of the essential components of the reform process in general, and of long-term refreezing in particular, is monitoring and controlling the contents of communication among persons in the managed group (Lifton 1961; Schein, Cooley, and Singer 1960; Ofshe et al. 1974). If successfully accomplished, communication control eliminates a person's ability to safely express criticisms or to share private doubts and reservations. The result is to confer on the community the quality of being a spy system of the whole upon the whole.
The typically observed complex of communication-controlling rules requires people to self-report critical thoughts to authorities or to make doubts known only in approved and readily managed settings (e.g., small groups or private counseling sessions). Admitting "negativity" leads to punishment or reindoctrination through procedures sometimes euphemistically termed "education" or "therapy." Individual social isolation is furthered by rules requiring peers to "help" colleagues progress by reporting their expressions of doubt. If discovered, failure to make such a report is itself punishable, because it reflects the low level of commitment of the person who did not "help" a colleague to make progress.
Controlling communication effectively blocks individuals from testing the appropriateness of privately held critical perceptions against the views of even their families and most-valued associates. Community norms encourage doubters to interpret lingering reservations as signs of a personal failure to comprehend the truth of the ideology; if involved with religious organizations, to interpret doubt as evidence of sinfulness or the result of demonic influences; if involved with an organization delivering a supposed psychological or medical therapy, as evidence of continuing illness and/or failure to progress in treatment.
The significance of communication control is illustrated by the collapse of a large psychotherapy organization in immediate reaction to the leadership's loss of effective control over interpersonal communication. At a meeting of several hundred members of this "therapeutic community," clients were allowed to openly voice privately held reservations about their treatment and exploitation. They had been subjected to abusive practices that included assault, sexual and economic exploitation, and extremes of public humiliation. When members discovered the extent to which their sentiments about these practices were shared by their peers, they rebelled (Ayalla 1998).
Two widespread myths have developed from misreading the early studies of thought-reforming influence systems (Zablocki 1991). These studies dealt in part with the use of such systems to elicit false confessions in the Soviet Union after the 1917 revolution, from American and United Nations forces held as POWs during the Korean War, and from Western missionaries held in China following Mao's revolution.
The first myth concerns the necessity and effectiveness of physical abuse in the reform process. The myth holds that physical abuse is not only necessary but is the prime cause of apparent belief change. Reports about the treatment of POWs and foreign prisoners in China documented that physical abuse was present. Studies of the role of assault in the promotion of attitude change and in eliciting false confessions from U.S. servicemen revealed, however, that it was ineffective: belief change and compliance were more likely when physical abuse was minimal or absent (Biderman 1960). Both Schein (1961) and Lifton (1961) reported that physical abuse was a minor element in the understanding of even prison reform programs in China.
In the main, efforts at resocializing China's nationals were conducted under nonconfined/nonassaultive conditions. Millions of China's citizens underwent reform in schools, special-training centers, factories, and neighborhood groups in which physical assault was not used as a coercive technique. One such setting for which many participants actively sought admission, the "Revolutionary University," was classified by Lifton as the "hard core of the entire Chinese thought reform movement" (Lifton 1961, p. 248).
Attribution theories would predict that if there were differences between the power of reform programs to promote belief change in settings that were relatively more or less blatantly coercive and physically threatening, the effect would be greatest in less-coercive programs. Consistent with this expectation, Lifton concluded that reform efforts directed against Chinese citizens were "much more successful" than efforts directed against Westerners (Lifton 1961, p. 400).
A second myth concerns the purported effects of brainwashing. Media reports about thought reform's effects far exceed the findings of scientific studies—which show coercive persuasion's upper limit of impact to be that of inducing personal confusion and significant, but typically transitory, attitude change. Brainwashing was promoted as capable of stripping victims of their capacity to assert their wills, thereby rendering them unable to resist the orders of their controllers. People subjected to "brainwashing" were not merely influenced to adopt new attitudes but, according to the myth, suffered essentially an alteration in their psychiatric status from normal to pathological, while losing their capacity to decide to comply with or resist orders.
This lurid promotion of the power of thought-reforming influence techniques to change a person's capacity to resist direction is entirely without basis in fact: No evidence, scientific or otherwise, supports this proposition. No known mental disorder produces the loss of will that is alleged to be the result of brainwashing. Whatever behavior and attitude changes result from exposure to the process, they are most reasonably classified as the responses of normal individuals to a complex program of influence.
The U.S. Central Intelligence Agency seems to have taken seriously the myth about brainwashing's power to destroy the will. Due, perhaps, to concern that an enemy had perfected a method for dependably overcoming will—or perhaps in hope of being the first to develop such a method—the Agency embarked on a research program, code-named MKULTRA. It was a pathetic and tragic failure. On the one hand, it funded some innocuous and uncontroversial research projects; on the other, it funded or supervised the execution of several far-fetched, unethical, and dangerous experiments that failed completely (Marks 1979; Thomas 1989).
Although no evidence suggests that thought reform is a process capable of stripping a person of the will to resist, a relationship does exist between thought reform and changes in psychiatric status. The stress and pressure of the reform process cause some percentage of psychological casualties. To reduce resistance and to motivate behavior change, thought-reform procedures rely on psychological stressors, the induction of high degrees of emotional distress, and other intrinsically dangerous influence techniques (Heide and Borkovec 1984). The process has a potential to cause psychiatric injury, which is sometimes realized. The major early studies (Hinkle and Wolff 1956; Lifton 1961; Schein 1961) reported that during the unfreezing phase individuals were intentionally stressed to a point at which some displayed symptoms of being on the brink of psychosis. Managers attempted to reduce psychological pressure when this happened, to avoid serious psychological injury to those obviously near the breaking point.
Contemporary programs speed up the reform process through the use of more psychologically sophisticated and dangerous procedures to accomplish destabilization. In contemporary programs the process is sometimes carried forward on a large-group basis, which reduces the ability of managers to detect symptoms of impending psychiatric emergencies. In addition, in some of the "therapeutic" ideologies espoused by thought-reforming organizations, extreme emotional distress is valued positively, as a sign of progress. Studies of contemporary programs have reported a variety of psychological injuries related to the reform process, including psychosis, major depressions, manic episodes, and debilitating anxiety (Glass, Kirsch, and Parris 1977; Haaken and Adams 1983; Heide and Borkovec 1984; Higget and Murray 1983; Kirsch and Glass 1977; Yalom and Lieberman 1971; Lieberman 1987; Singer and Ofshe 1990; Singer and Lalich 1995, 1996).
HIGH CONTROL GROUPS AND LARGE GROUP AWARENESS TRAINING
Political reeducation in China was backed by the power of the state to request, if not compel, participation. Contemporary examples of extreme influence that rise to the level of attempting to induce a major shift in belief typically lack the power to coerce someone into remaining involved long enough for the persuasive procedures of the program to unfreeze, change, and refreeze the person's values, perceptions, and preferences. Rather, participants must be attracted to the group and must participate long enough for the tactics utilized during the unfreezing phase to destabilize their sense of identity, so that they will be motivated to adopt the new world view or the new perceptions promoted by the program's managers, will desire to maintain association with committed believers in the new perspective, and will thereby obtain the long-term social support necessary to restabilize their identities.
Contemporary thought-reform programs are generally far more sophisticated in their selection of both destabilization and influence techniques than were the programs studied during the 1950s (Ofshe and Singer 1986; Singer and Lalich 1995, 1996). For example, hypnosis was entirely absent from the first programs studied but is often observed in modern programs. In most modern examples in which hypnosis is present, it functions as a remarkably powerful technique for manipulating subjective experience and for intensifying emotional response. It provides a method for influencing people to imagine impossible events such as those that supposedly occurred in their "past lives," the future, or during visits to other planets. If persons so manipulated misidentify the hypnotically induced fantasies, and classify them as previously unavailable memories, their confidence in the content of a particular ideology can be increased (Bainbridge and Stark 1980).
Hypnosis can also be used to lead people to allow themselves to relive actual traumatic life events (e.g., rape, childhood sexual abuse, near-death experiences, etc.) or to fantasize the existence of such events and, thereby, to stimulate the experience of extreme emotional distress. When embedded in a reform program, repeatedly leading the person to experience such events can function simply as punishment, useful for coercing compliance.
Accounts of contemporary programs also describe the use of sophisticated techniques intended to strip away psychological defenses, to induce regression to primitive levels of coping, and to flood targets with powerful emotion (Ayalla 1998; Haaken and Adams 1983; Hochman 1984; Temerlin and Temerlin 1982; Singer and Lalich 1996). In some instances stress and fatigue have been used to promote hallucinatory experiences that are defined as therapeutic (Gerstel 1982). Drugs have been used to facilitate disinhibition and heightened suggestibility (Watkins 1980). Thought-reform subjects have been punished for disobedience by being ordered to self-inflict severe pain, justified by the claim that the result will be therapeutic (Bellack et al. v. Murietta Foundation et al.).
Programs attempting thought reform appear in various forms in contemporary society. They depend on the voluntary initial participation of targets. This is usually accomplished because the target assumes that there is a common goal that unites him or her with the organization or that involvement will confer some benefit (e.g., relief of symptoms, personal growth, spiritual development, etc.). Apparently some programs were developed based on the assumption that they could be used to facilitate desirable changes (e.g., certain rehabilitation or psychotherapy programs). Some religious organizations and social movements utilize them for recruitment purposes. Some commercial organizations utilize them as methods for promoting sales. In some instances, reform programs appear to have been operated for the sole purpose of gaining a high degree of control over individuals to facilitate their exploitation (Ofshe 1986; McGuire and Norton 1988; Watkins 1980).
Virtually any acknowledged expertise or authority can serve as a power base to develop the social structure necessary to carry out thought reform. In the course of developing a new form of rehabilitation, psychotherapy, religious organization, utopian community, school, or sales organization, it is not difficult to justify the introduction of thought-reform procedures.
Perhaps the most famous example of a thought-reforming program developed for the ostensible purpose of rehabilitation was Synanon, a drug-treatment program (Sarbin and Adler 1970; Yablonsky 1965; Ofshe et al. 1974). The Synanon environment possessed all of Lifton's eight themes. It used as its principal coercive procedure a highly aggressive encounter/therapy group interaction. In form it resembled the "struggle groups" observed in China (Whyte 1976), but it differed in content. Individuals were vilified and humiliated not for past political behavior but for current conduct as well as far more psychologically intimate subjects, such as early childhood experiences, sexual experiences, degrading experiences as adults, etc. The coercive power of the group experience to affect behavior was substantial, as was its ability to induce psychological injury (Lieberman, Yalom, and Miles 1973; Ofshe et al. 1974).
Allegedly started as a drug-rehabilitation program, Synanon failed to accomplish significant long-term rehabilitation. Eventually, Synanon's leader, Charles Dederich, promoted the idea that any degree of drug abuse was incurable and that persons so afflicted needed to spend their lives in the Synanon community. Synanon's influence program was successful in convincing many that this was so. Under Dederich's direction, Synanon evolved from an organization that espoused nonviolence into one that was violent. Its soldiers were dispatched to assault and attempt to murder persons identified by Dederich as Synanon's enemies (Mitchell, Mitchell, and Ofshe 1981).
The manipulative techniques of self-styled messiahs, such as People's Temple leader Jim Jones (Reiterman and Jacobs 1982), and the influence programs operated by religious organizations, such as the Unification Church (Taylor 1978) and Scientology (Wallis 1977; Bainbridge and Stark 1980), can be analyzed as thought-reform programs. The most controversial recruitment system operated by a religious organization in recent American history was that of the Northern California branch of the Unification Church (Reverend Moon's organization). The influence program was built directly from procedures of psychological manipulation that were commonplace in the human-potential movement (Bromley and Shupe 1981). The procedures involved various group-based exercises as well as events designed to elicit from participants information about their emotional needs and vulnerabilities. Blended into this program was content intended to slowly introduce the newcomer to the group's ideology. Typically, the program's connection with the Unification Church or any religious mission was denied during the early stages of the reform process. The target was monitored around the clock and prevented from communicating with peers who might reinforce doubt and support a desire to leave. The physical setting was an isolated rural facility far from public transportation.
Initial focus on personal failures, guilt-laden memories, and unfulfilled aspirations shifted to the opportunity to realize infantile desires and idealistic goals, by affiliating with the group and its mission to save the world. The person was encouraged to develop strong affective bonds with current members. They showed unfailing interest, affection, and concern, sometimes to the point of spoon-feeding the person's meals and accompanying the individual everywhere, including to the toilet. If the unfreezing and change phases of the program succeeded, the individual was told of the group's affiliation with the Unification Church and assigned to another unit of the organization within which refreezing procedures could be carried forward.
POLICE INTERROGATION AND FALSE CONFESSIONS
Police interrogation in America was transformed in the twentieth century from a method of gaining compliance relying largely on physical coercion into one that relies almost exclusively on psychological means (Leo 1992). Psychological methods of interrogation were developed to influence persons who know they are guilty of a crime to change their decision to deny guilt, to admit responsibility for the crime, and to confess fully to their role in the crime (see Hilgendorf and Irving 1981; Ofshe and Leo 1997a,b). Compared to the other extreme influence environments described in this entry, a police interrogation is of relatively short duration and is entirely focused on the single issue of eliciting a confession. It is nevertheless an environment in which a person can be influenced to make a dramatic shift in position—from denial of guilt to confession.
The techniques and tactics that lead a guilty suspect to admit guilt constitute an impressive display of the power of influence to change a person's decision even when the consequence of the shift is obviously disadvantageous. If the procedures of interrogation are misused, modern interrogation methods can have an even more impressive result. If the influence procedures and techniques of modern interrogation methods are directed at innocent persons, some false confessions will result (Bedau and Radelet 1987; Leo and Ofshe 1998). In most instances when an innocent person is led to give a false confession the cause is coercion—the use of a threat of severe punishment if the person maintains that he is innocent and an offer of leniency if he complies with the interrogator's demand to confess. Most persons who decide to comply and offer a false confession in response to coercion remain certain of their innocence and know that they are falsely confessing in order to avoid the most severe possible punishment.
Under some circumstances, however, interrogation tactics can cause an innocent person to give what is called a persuaded false confession—a false confession that is believed to be true when it is given. Influence procedures now commonly used during modern police interrogation can sometimes inadvertently manipulate innocent persons' beliefs about their own innocence and, thereby, cause them to falsely confess. Confessions resulting from accomplishing the unfreezing and change phases of thought reform are classified as persuaded false confessions (Kassin and Wrightsman 1985; Gudjonsson and MacKeith 1988; Ofshe and Leo 1997a). Although they rarely occur together, the ingredients necessary to elicit a persuaded false confession are erroneous police suspicion, the use of certain commonly employed interrogation procedures, and some degree of psychological vulnerability in the suspect. Philip Zimbardo (1971) has reviewed the coercive factors generally present in modern interrogation settings. Richard Ofshe and Richard Leo (1989, 1997a) have identified the influence procedures that, if present in a suspect's interrogation, contribute to causing unfreezing and change.
Techniques that contribute to unfreezing include falsely telling a suspect that the police have evidence proving the person's guilt (e.g., fingerprints, eyewitness testimony, etc.). Suspects may be given a polygraph examination and then falsely told (due either to error or design) that they failed and the test reveals their unconscious knowledge of guilt. Suspects may be told that their lack of memory of the crime was caused by an alcohol- or drug-induced blackout, was repressed, or is explained because the individual is a multiple personality.
The techniques listed above regularly appear in modern American police interrogations. They are used to lead persons who know that they have committed the crime at issue to decide that the police have sufficient evidence to convict them or to counter typical objections to admitting guilt (e.g., "I can't remember having done that."). In conjunction with the other disorienting and distressing elements of a modern accusatory interrogation, these tactics can sometimes lead innocent suspects to doubt themselves and question their lack of knowledge of the crime. If innocent persons subjected to these sorts of influence techniques do not reject the false evidence and realize that the interrogators are lying to them, they have no choice but to doubt their memories. If the interrogator supplies an explanation for why the suspect's memory is untrustworthy, the person may reason that "I must have committed this crime."
Tactics used to change the suspect's position and elicit a confession include maneuvers designed to intensify feelings of guilt and emotional distress following from the suspect's assumption of guilt. Suspects may be offered an escape from the emotional distress through confession. It may also be suggested that confession will provide evidence of remorse that will benefit the suspect in court.
RECOVERED MEMORY PSYCHOTHERAPY
The shifts in belief and conduct promoted during political reeducation concern an intellectual analysis of society and one's role in it; in the context of high control groups the beliefs are about theology and one's role in a community; and in police interrogation the newly created beliefs concern a crime the suspect has no knowledge of having committed. The progression in these illustrations is from influence efforts directed at an intellectual or philosophical assumption to an influence effort directed at changing beliefs about a single fact: Did the suspect commit a crime and not know of it due to a memory defect? The final example of extreme influence and belief change is arguably the most dramatic demonstration of the power of interpersonal influence and of particular influence techniques to change a person's beliefs about the historical truth of a major dimension of his or her life history, such as whether or not he or she had been viciously raped and horribly brutalized by a parent (or parents, or siblings, or teachers, or neighbors, etc.) for a period of as long as two decades.
Psychotherapy directed at causing a patient to retrieve allegedly repressed, and therefore supposedly unavailable, memories (e.g., of sexual abuse, and/or of having spent one's life suffering from an unrecognized multiple personality disorder, and/or of having spent one's childhood and teenage years as a member of a murderous satanic cult) is one of the most stunning examples of psychological/psychiatric quackery of the twentieth century. It is also perhaps the most potent example of the power of social influence to predictably create beliefs (in this case beliefs that are utterly false) and thereby alter a person's choices and conduct (Loftus and Ketcham 1994; Pendergrast 1995; Ofshe and Watters 1994; Spanos 1996).
The elements of a thought-reforming process are visible in the steps through which a patient undergoing recovered memory therapy is manipulated. The patient's identity is destabilized by the therapist's insistence that the reason for the patient's distress or mental illness is that he or she suffered a series of sexual traumas in childhood that he or she does not know about because the memories have been repressed. Recovered memory therapists organize their treatment programs on the presumption that repression exists, despite the fact that the notion of repression was never more than a fanciful, unsubstantiated speculation that was long ago rejected as useless by the scientific community that studies human memory (Loftus and Ketcham 1994; Crews 1998; Watters and Ofshe 1999). Recovered memory therapists rely on assertions of authority, claims of expertise, and outright trickery (e.g., inducing the patient to experience hypnotic fantasies of sexual abuse that the therapist leads the patient to misclassify as memories, and interpreting the patient's dreams as proof of repressed memories) to convince the patient of the existence of repression. They rely on the alleged "repression mechanism" to undermine the patient's confidence in his or her normal memories. If patients can be convinced that repression exists and that the hypnotic fantasies the therapist suggests to them are memories of events that happened during their lives, they can no longer trust that they know even the broad outlines of their life histories.
Once the patient's identity has been destabilized, the therapist guides the patient to build a new identity centered on his or her status as a victim of sexual abuse. This victim role may include requiring the patient to learn to act as if he or she suffers from multiple personality disorder, or suggesting that the patient publicly denounce, sue, or file criminal charges against the persons who supposedly committed the abuse.
CONCLUSION
Extreme influence environments are not easy to study. Most of the basic descriptive work in the history of research on extreme influence has been conducted through post hoc interviewing of persons exposed to the influence procedures. The second most frequently employed method has been participant observation. In connection with work on police interrogation methods, it has been possible to analyze contemporaneous recordings of interrogation sessions in which targets' beliefs are actually made to undergo radical change. All this work has contributed to an understanding of extreme influence in several ways.
Studying these environments demonstrates that the extremes of influence are no more or less difficult to understand than any other complex social event. The characteristics that distinguish extreme influence environments from other social settings are the order in which the influence procedures are assembled and the degree to which the target's environment is manipulated in the service of social control. These are at most unusual arrangements of commonplace bits and pieces.
As it is with all complex, real-world social phenomena that cannot be studied experimentally, understanding information about the thought-reform process proceeds through the application of theories that have been independently developed. Explaining data that describe the type and organization of the influence procedures that constitute an extreme influence environment depends on applying established social-psychological theories about the manipulation of behavior and attitude change. Assessing reports about the impact of the experiences on the persons subjected to intense influence procedures depends on the application of current theories of personality formation and change. Understanding instances in which the thought-reform experience appears related to psychiatric injury requires proceeding as one would ordinarily in evaluating any case history of a stress-related or other type of psychological injury.
(see also: Attitudes; Persuasion; Social Control)
REFERENCES
Ayalla, Marybeth 1998 Insane Therapy. Philadelphia: Temple University Press.
Bainbridge, William S., and Rodney Stark 1980 "Scientology, to Be Perfectly Clear." Sociological Analysis 41:128–136.
Bedau, Hugo, and Michael Radelet 1987 "Miscarriages of Justice in Potentially Capital Cases." Stanford Law Review 40:21–197.
Bellack, Catherine et al. v. Murietta Foundation et al. United States District Court, Central District of California. Civil No. 87–08597.
Bem, Daryl 1972 "Self-Perception Theory." In Leonard Berkowitz, ed., Advances in Experimental Social Psychology, vol. 6. New York: Academic.
Biderman, Albert D. 1960 "Social-Psychological Needs and Involuntary Behavior as Illustrated by Compliance in Interrogation." Sociometry 23:120–147.
Bromley, David G., and Anson D. Shupe, Jr. 1981 Strange Gods. Boston: Beacon Press.
Dornbusch, Sanford M. 1955 "The Military Academy as an Assimilating Institution." Social Forces 33:316–321.
Farber, I. E., Harry F. Harlow, and Louis J. West 1956 "Brainwashing, Conditioning and DDD: Debility, Dependency and Dread." Sociometry 20:271–285.
Festinger, Leon 1957 A Theory of Cognitive Dissonance. Evanston, Ill.: Row Peterson.
Gerstel, David 1982 Paradise Incorporated: Synanon. Novato, Calif.: Presidio.
Glass, Leonard L., Michael A. Kirsch, and Frederick A. Parris 1977 "Psychiatric Disturbances Associated with Erhard Seminars Training: I. A Report of Cases." American Journal of Psychiatry 134:245–247.
Goffman, Erving 1957 "On the Characteristics of Total Institutions." Proceedings of the Symposium on Preventive and Social Psychiatry. Washington, D.C.: Walter Reed Army Institute of Research.
Gudjonsson, Gisli, and Bent Lebegue 1989 "Psychological and Psychiatric Aspects of a Coerced-Internalized False Confession." Journal of the Forensic Science Society 29:261–269.
Gudjonsson, Gisli H., and James A. MacKeith 1988 "Retracted Confessions: Legal, Psychological and Psychiatric Aspects." Medical Science and Law 28: 187–194.
Haaken, Janice, and Richard Adams 1983 "Pathology as 'Personal Growth': A Participant Observation Study of Lifespring Training." Psychiatry 46:270–280.
Heide, F. J., and T. D. Borkovec 1984 "Relaxation Induced Anxiety: Mechanism and Theoretical Implications." Behavior Research and Therapy 22:1–12.
Higget, Anna C., and Robin M. Murray 1983 "A Psychotic Episode Following Erhard Seminars Training." Acta Psychiatrica Scandinavica 67:436–439.
Hilgendorf, Edward, and Barrie Irving 1981 "A Decision-Making Model of Confession." In S.M.A. Loyd-Bostock, ed., Psychology in Legal Contexts. London: Macmillan.
Hinkle, L. E., and Harold G. Wolff 1956 "Communist Interrogation and Indoctrination of Enemies of the State." Archives of Neurology and Psychiatry 76:115–174.
Hochman, John A. 1984 "Iatrogenic Symptoms Associated with a Therapy Cult: Examination of an Extinct 'New Therapy' with Respect to Psychiatric Deterioration and 'Brainwashing'." Psychiatry 47:366–377.
Hulme, Kathryn 1956 The Nun's Story. Boston: Little, Brown.
Hunter, Edward 1951 Brain-washing in China. New York: Vanguard.
Kassin, Saul, and Lawrence Wrightsman 1985 "Confession Evidence." In Saul Kassin and Lawrence Wrightsman, eds., The Psychology of Evidence and Trial Procedure. London: Sage.
Kirsch, Michael A., and Leonard L. Glass 1977 "Psychiatric Disturbances Associated with Erhard Seminars Training: II. Additional Cases and Theoretical Considerations." American Journal of Psychiatry 134: 1254–1258.
Leo, Richard 1992 "From Coercion to Deception: The Changing Nature of Police Interrogation in America." Crime, Law and Social Change 18:35–59.
——, and Richard Ofshe 1998 "The Consequences of False Confessions." Journal of Criminal Law and Criminology 88, 2:429–496.
Lieberman, Morton A. 1987 "Effect of Large Group Awareness Training on Participants' Psychiatric Status." American Journal of Psychiatry, 144:460–464.
——, Irvin D. Yalom, and M. B. Miles 1973 Encounter Groups: First Facts. New York: Basic Books.
Lifton, Robert J. 1954 "Home by Ship: Reaction Patterns of American Prisoners of War Repatriated from North Korea." American Journal of Psychiatry 110:732–739.
——1961 Thought Reform and the Psychology of Totalism. New York: Norton.
——1986 The Nazi Doctors. New York: Basic Books.
——1987 "Cults: Totalism and Civil Liberties." In Robert J. Lifton, ed., The Future of Immortality and Other Essays for a Nuclear Age. New York: Basic Books.
Loftus, Elizabeth, and Katherine Ketcham 1994 The Myth of Repressed Memory. New York: St. Martin's.
Marks, John 1979 The Search for the Manchurian Candidate. New York: Dell.
McGuire, Christine, and Carla Norton 1988 Perfect Victim. New York: Arbor House.
Meerloo, Joost A. 1956 The Rape of the Mind: The Psychology of Thought Control, Menticide and Brainwashing. Cleveland, Ohio: World Publishing.
Mitchell, David V., Catherine Mitchell, and Richard J. Ofshe 1981 The Light on Synanon. New York: Seaview.
Ofshe, Richard J. 1980 "The Social Development of the Synanon Cult: The Managerial Strategy of Organizational Transformation." Sociological Analysis 41: 109–127.
——1986 "The Rabbi and the Sex Cult." Cultic Studies Journal 3:173–189.
——1989 "Coerced Confessions: The Logic of Seemingly Irrational Action." Cultic Studies Journal 6:1–15.
——1992 "Inadvertant Hypnosis During Interrogation: False Confession Due to Dissociative State; Mis-Identified Multiple Personality and the Satanic Cult Hypothesis." International Journal of Clinical and Experimental Hypnosis 40, (3):125–156.
——, and Richard Leo 1997a "The Social Psychology of Police Interrogation: The Theory and Classification of True and False Confessions." Studies in Law, Politics and Society 16:190–251.
—— 1997b "The Decision to Confess Falsely: Rational Choice and Irrational Action." Denver University Law Review 74, 4:979–1122.
Ofshe, Richard, and Margaret T. Singer 1986 "Attacks on Peripheral Versus Central Elements of Self and the Impact of Thought Reforming Techniques." Cultic Studies Journal 3:3–24.
——, Nancy Eisenberg, Richard Coughlin, and Gregory Dolinajec 1974 "Social Structure and Social Control in Synanon." Voluntary Action Research 3:67–76.
Ofshe, Richard, and Ethan Watters 1994 Making Monsters. New York: Scribners.
Pendergrast, Mark 1995 Victims of Memory. Hinesburg, Vt.: Upper Access.
Reiterman, Timothy, and John Jacobs 1982 The Raven. New York: Dutton.
Sarbin, Theodore R., and Nathan Adler 1970 "Self-Reconstitution Processes: A Preliminary Report." Psychoanalytic Review 4:599–616.
Sargant, William 1957 Battle for the Mind: How Evangelists, Psychiatrists, and Medicine Men Can Change Your Beliefs and Behavior. Garden City, N.Y.: Doubleday.
Schein, Edgar H. 1961 Coercive Persuasion. New York: Norton.
——, W. E. Cooley, and Margaret T. Singer 1960 A Psychological Follow-up of Former Prisoners of the Chinese Communists, Part I, Results of Interview Study. Cambridge: MIT.
Schurmann, Franz 1968 Ideology and Organization in Communist China. Berkeley: University of California Press.
Singer, Margaret, and Janja Lalich 1995 Cults in Our Midst. San Francisco: Jossey-Bass.
——1996 Crazy Therapies. San Francisco: Jossey-Bass.
Singer, Margaret T., and Richard J. Ofshe 1990 "Thought Reform Programs and the Production of Psychiatric Casualties." Psychiatric Annals 20:188–193.
Taylor, David 1978 "Social Organization and Recruitment in the Unification Church." Master's thesis, University of Montana.
Temerlin, Maurice K., and Jane W. Temerlin 1982 "Psychotherapy Cults: An Iatrogenic Phenomenon." Psychotherapy: Theory, Research and Practice 19:131–141.
Thomas, Gordon 1989 Journey Into Madness. New York: Bantam.
Wallis, Roy 1977 The Road to Total Freedom. New York: Columbia University Press.
Watkins, Paul 1980 My Life With Charles Manson. New York: Bantam.
Whyte, Martin K. 1976 Small Groups and Political Behavior in China. Berkeley: University of California Press.
Wright, Stuart 1987 Leaving Cults: The Dynamics of Defection. Society for the Scientific Study of Religion, Monograph no. 7. Washington, D.C.
Yablonsky, Lewis 1965 The Tunnel Back: Synanon. New York: Macmillan.
Yalom, Irvin D., and Morton Lieberman 1971 "A Study of Encounter Group Casualties." Archives of General Psychiatry 25:16–30.
Zablocki, Benjamin 1991 The Scientific Investigation of the Brainwashing Conjecture. Washington D.C.: American Association for the Advancement of Science.
Zimbardo, Philip G. 1971 "Coercion and Compliance." In Charles Perrucci and Mark Pilisuk, eds., The Triple Revolution. Boston: Little, Brown.
Richard J. Ofshe