Tyranny Enabled Through Identification, Not Passive Conformity, New Analysis Shows

Friday, November 23, 2012 16:51
(Before It's News)

This post comes to us from Planetsave.com. For more along these lines, visit Planetsave or some of its most popular categories: Global Warming, Science, Going Green Tips, Animals, or 10 Friday Photos.

Nuremberg trials, defendants

The idea that atrocities result from “unblinking obedience” to authority has become so ingrained in our social consciousness that we rarely ever question it.

Indeed, more recently, popular culture has adopted the phrase “Resistance is futile” (the mantra of the ‘Borg’ in Star Trek: The Next Generation) to encapsulate, even enshrine, our belief in how complete our capitulation to authority is and, consequently, in the emergence of tyranny.

Some innate — perhaps primordial — instinct for passivity and conformity in the presence of authority figures is posited as the predominant (academic) explanation for the rise of tyranny in societies. Most scholars on the subject accept that we are “programmed to obey.”

But new research drawing upon evidence from the historical record and two ‘classic’ experiments offers a compelling argument that passive conformity is neither inevitable nor the cause of our oppression of and brutality towards our fellow humans.

Instinctive Conformity as Cause of Tyranny – Exploring The Roots

This pervasive societal belief has its roots in two ‘classic’ social-psychology experiments from the 1960s and 1970s.

In the earlier decade (1963), we have the (in)famous experiments conducted by Stanley Milgram in which he sought to reveal how the horrors of Nazi concentration camps could have happened.

In Milgram’s studies, mostly male volunteers — called ‘teachers’  — were told to administer a memory test to subjects (‘learners’) and punish them — by applying increasing levels of electric shocks — when they got the answers wrong. The teachers controlled a panel of switches that administered the electric shocks — ranging from a mere 15 volts up to 450 volts.

Disturbingly, with only a lab-coated, clipboard-wielding Experimenter standing by and urging them on, all of the volunteers (in one trial) continued to give ever more intense shocks (up to 300 V) to the test subject even though they could hear his screams of pain and pleas to stop. What’s more, in the ‘baseline’ study, 65% of the ‘teachers’ went all the way to 450 V.

Unbeknownst to the teachers, the test subjects (‘learners’) were actually confederates/actors who were cued when to react to the (fake) “electric shock”.

Milgram wrote about these experiments in his book Obedience to Authority: An Experimental View (1974).

In the following decade (1973), we have the equally infamous ‘Stanford Prison Experiment’ conducted by Philip Zimbardo with Craig Haney and Curtis Banks (funded by the US Office of Naval Research), in which student volunteers were randomly divided into prisoners and guards in a simulated “prison” laboratory in the basement of the Psychology Department building. In contrast to Milgram’s study, Zimbardo and his colleagues wanted to observe the interactions between the two groups in the absence of any “malevolent” authority.

The participants readily assumed their respective ‘dominant’ and ‘submissive’ roles, with the guards adopting increasingly brutal and abusive tactics to manage the prisoners. The abuse became so egregious that the study was terminated after just six days.

The researchers’ conclusion was even more alarming than Milgram’s: brutality was “a ‘natural’ consequence of being in the uniform of a ‘guard’ and asserting the power inherent in that role”. People do not need any specific orders (or the physical presence of authorities) to evolve a tyrannical culture; they conform “unthinkingly” to the roles prescribed by authorities.

Zimbardo writes about this experiment in his 2007 book The Lucifer Effect: How Good People Turn Evil.

These two classic studies seem to offer substantive proof of “the banality of evil”*: people blindly conform to their assigned roles and to the instructions given to them by those in power. It is, it seems, the tragedy of human nature: we would rather be “good subjects than subjects who do good.”

But is that all there is to understand here? Is there another, more compelling interpretation of these studies?

Conformity Isn’t Natural and It Doesn’t Explain Tyranny

Revisiting these classic experiments, researchers Haslam and Reicher (2012, see citation below) offer a new analysis that sheds greater light on our tendency to conform to the will of authority.

In their analysis, the researchers challenge this consensus through empirical studies based upon Social Identity Theory. In fact, they argue, their re-interpretation of the results and conclusions of these two studies derives from evidence found in the studies themselves. This evidence, they assert, supports an entirely different interpretation of the ‘psychology of conformity’.

Their analysis suggests that conformity is neither natural nor the (sole) cause of tyranny. Rather, our willingness to obey authority is conditioned upon our identification with the authority, and, “an associated belief that the authority is right.”

Drawing upon their own analysis and previous examinations of historical atrocities (most specifically, the Nazis’ attempted “eliminationist” campaign), Haslam and Reicher first note that others have questioned the commonly held belief that Nazi bureaucrats and officials were ever “just following orders”.

Quoting from their recently published PLoS Biology paper:

“This may have been the defense they relied upon when seeking to minimize their culpability, but evidence suggests that functionaries like Eichmann had a very good understanding of what they were doing and took pride in the energy and application that they brought to their work. Typically too, roles and orders were vague, and hence for those who wanted to advance the Nazi cause (and not all did), creativity and imagination were required in order to work towards the regime’s assumed goals and to overcome the challenges associated with any given task. Emblematic of this, the practical details of “the final solution” were not handed down from on high, but had to be elaborated by Eichmann himself. He then felt compelled to confront and disobey his superiors—most particularly Himmler—when he believed that they were not sufficiently faithful to eliminationist Nazi principles.”

The researchers then turn to our two classic psychology experiments, noting similarities in these experiments to the historical account of the Nazi campaign.

As to the Stanford prison experiment, they note:

“So while it may be true that Zimbardo gave his guards no direct orders, he certainly gave them a general sense of how he expected them to behave. During the orientation session he told them, amongst other things, ‘You can create in the prisoners feelings of boredom, a sense of fear to some degree, you can create a notion of arbitrariness that their life is totally controlled by us, by the system, you, me… We’re going to take away their individuality in various ways. In general what all this leads to is a sense of powerlessness.’

This contradicts Zimbardo’s assertion that ‘behavioral scripts associated with the oppositional roles of prisoner and guard [were] the sole source of guidance’ and leads us to question the claim that conformity to these role-related scripts was the primary cause of guard brutality.”

Haslam and Reicher also note that not all of the guards acted brutally towards the prisoners, and that those who did act brutally “used ingenuity and initiative in responding to Zimbardo’s brief” (that is, his orientation talk to participants before the study started). Indeed, after the study was terminated, one of the student ‘prisoners’ confronted one of the guards who had brutalized him, saying: “If I had been a guard I don’t think it would have been such a masterpiece.”

According to the authors, this contradicts the ingrained notion of the “banality of evil”, insofar as the emergent tyranny of the guards “was made possible by the active engagement of enthusiasts rather than the leaden conformity of automatons.” (They also allude to the subjects being “inspired” by Zimbardo.)

As to the seminal experiments by Stanley Milgram, Haslam and Reicher first note that the measured action here — the flipping of a switch to apply the shock — offered no room for variation on the part of the teachers (unlike Zimbardo’s subjects, who had far more “creative” choices in carrying out their roles).

More importantly, they point out that many subsequent researchers failed to draw upon Milgram’s entire body of work regarding these experiments, and in so doing, drew the wrong conclusions from them.

Quote:

“…it is clear that the ‘baseline study’ is not especially typical of the 30 or so variants of the paradigm that Milgram conducted. Here the percentage of participants going to 450 V varied from 0% to nearly 100%, but across the studies as a whole, a majority of participants chose not to go this far.”

The researchers here took a closer look at Milgram’s sessions and observed that subjects (the ‘teachers’) experienced varying degrees of mental anguish, being morally torn between two irreconcilable demands: the Experimenter’s proddings to continue and the screams of the ‘learners’ who got the electric shocks.

The authors state:

“They sweat, they laugh, they try to talk and argue their way out of the situation. But the experimental set-up does not allow them to do so. Ultimately, they tend to go along with the Experimenter if he justifies their actions in terms of the scientific benefits of the study (as he does with the prod “The experiment requires that you continue”). But if he gives them a direct order (“You have no other choice, you must go on”) participants typically refuse.” [emphasis added]

Here again, they note, the consensus view is cast into doubt; study subjects did what they did not out of blind conformity, but rather, a belief that what they were doing was important.

Tyranny as a Product of Identification-based ‘Followership’

In questioning the ‘banality of evil’ paradigm, the authors draw upon their own more recent research with the BBC Prison Study, which sought to explore the dynamics between guards and prisoners more thoroughly. In this follow-up to the Stanford experiment, the researchers took no leadership role (unlike Zimbardo), desiring to discover whether participants would conform to the “hierarchical script” or resist it.

From this latter-day prison experiment, three findings resulted. According to Haslam and Reicher:

“First, participants did not conform automatically to their assigned role. Second, they only acted in terms of group membership to the extent that they actively identified with the group (such that they took on a social identification). Third, group identity did not mean that people simply accepted their assigned position; instead, it empowered them to resist it. Early in the study, the Prisoners’ identification as a group allowed them successfully to challenge the authority of the Guards and create a more egalitarian system.”

But they also note that:

“Later on, though, a highly committed group emerged out of dissatisfaction with this system and conspired to create a new hierarchy that was far more draconian.”

In summing up the findings from the BBC Prison Study, the authors assert that neither “passive conformity” to assigned roles, nor “blind obedience” to any rules, could account for the observed behaviors.

“…it was only when they had internalized roles and rules as aspects of a system with which they identified that participants used them as a guide to action. Moreover, on the basis of this shared identification, the hallmark of the tyrannical regime was not conformity but creative leadership and engaged followership within a group of true believers.”

Similarly, in the Milgram experiments, it was identification with the scientific enterprise, as defined by Milgram, and a commitment to both the experiment and the Experimenter that served as the mechanism of compliance with Milgram’s “prods”. This commitment stood above any commitment the teachers may have felt towards the ‘learners’  — or the general community (i.e., its norms, ethics and values).

Indeed, in a post-study debriefing by Milgram (in which he praised study subjects for their compliance even though it caused physical “discomfort” in the learners), participants expressed happiness that they had “been of service” and had contributed to an experiment from which “good” might come. Many expressed their support of future experiments of the kind.

Haslam and Reicher also note, significantly, that the degree of identification (with the Experimenter, or with the larger community) was not constant across all variants of this study, especially if one changed the environment in which the study took place (a prestigious university versus a commercial/private-sector facility).

“More systematically, we have examined variations in participants’ identification with the Experimenter and the science that he represents as opposed to their identification with the Learner and the general community. They always identify with both to some degree—hence the drama and the tension of the paradigm. But the degree matters, and greater identification with the Experimenter is highly predictive of a greater willingness among Milgram’s participants to administer the maximum shock across the paradigm’s many variants.”

Concluding Thoughts: Beyond the ‘Lucifer Effect’

Haslam and Reicher argue that understanding the emergence of tyranny requires a deeper understanding of the situation and context in which this tyranny arises. Specifically, social and psychological scientists need to examine two inter-related phenomena, or processes, that come into play whenever people are subject to the directives of authority figures.

In their words:

“To understand tyranny, then, we need to transcend the prevailing orthodoxy that this derives from something for which humans have a natural inclination—a “Lucifer effect” to which they succumb thoughtlessly and helplessly (and for which, therefore, they cannot be held accountable). Instead, we need to understand two sets of inter-related processes: those by which authorities advocate oppression of others and those that lead followers to identify with these authorities.”

While acknowledging that the consensus interpretation of these two experiments remains influential, the researchers emphasize that contrary evidence (observed within each experimental dynamic) has been ignored, even in studies that seem to support the idea that conformity (to authoritative directives) is “inevitable”.

Auschwitz concentration camp, arrival of Hungarian Jews, Summer 1944

Further, they assert that this traditional, consensus view also ignores the clear evidence that

“…those who do heed authority in doing evil do so knowingly not blindly, actively not passively, creatively not automatically. They do so out of belief not by nature, out of choice, not by necessity. In short, they should be seen—and judged—as engaged followers not as blind conformists.”

The fundamental assertion of this analysis is that tyranny flourishes not because followers of authority are helpless to do otherwise, but because these followers identify with authorities who “promote vicious acts as virtuous.” Once achieved, this identification makes participants work more enthusiastically, and creatively, to ensure the success of the tyrannical directive.

It is both surprising, and predictable, that those who commit such acts “actively wish to be held accountable—so long as it secures the approbation of those in power.” [emphasis added]

* This term was coined in Hannah Arendt’s account of the trial of Adolf Eichmann, a chief architect of the Nazis’ “final solution to the Jewish question”. Despite being responsible for the transportation of millions of people to their death, Arendt suggested that Eichmann was no psychopathic monster. Instead his trial revealed him to be a diligent and efficient bureaucrat—a man more concerned with following orders than with asking deep questions about their morality or consequence. [source: the authors]

Citation for this article:

Haslam, S. A., & Reicher, S. D. (2012). Contesting the “nature” of conformity: What Milgram and Zimbardo’s studies really show. PLoS Biology 10(11): e1001426 (Essay, published 20 Nov 2012). doi:10.1371/journal.pbio.1001426

Top Photo: Nuremberg Trials. Defendants in the dock. The main target of the prosecution was Hermann Göring (at the left edge of the first row of benches), considered to be the most important surviving official of the Third Reich after Hitler’s death.

Bottom Photo: Auschwitz concentration camp, arrival of Hungarian Jews, Summer 1944. Bundesarchiv, Bild 183-N0827-318 / CC-BY-SA
