
Zimbardo Unbound

Long after his notorious prison experiment and soon after the Abu Ghraib scandal, the famous psychologist lobbies for a greater understanding of how evil systems subvert good people.

May/June 2007

Photo: Glenn Matsumura

No matter what Philip Zimbardo does—publish research on shyness, time perspective or madness; teach wildly popular undergraduate courses; co-author the bestselling textbook Psychology and Life; star in the PBS series Discovering Psychology; or serve as president of the American Psychological Association—everybody remembers him for the famous (and infamous) 1971 Stanford Prison Experiment. The mock prison he’d set up in the basement of Jordan Hall quickly turned ordinary college students into abusive guards and degraded prisoners, some of whom broke down under the pretend prison’s all-too-real strain. When, decades later, Zimbardo first glimpsed televised images of the inmate abuse at Abu Ghraib, he was shocked by similarities to what he’d seen in his own study.

Before long, Zimbardo, 74, was not only giving media interviews, but serving as an expert witness in defense of Ivan “Chip” Frederick, a staff sergeant who was the highest-ranking soldier court-martialed for the crimes at the Iraqi prison. This role gave Zimbardo access to documents detailing horrendous conditions at the prison—and the evidence he needed to dispute official claims that the sadistic treatment of Iraqi detainees was an isolated incident and the work of a few rogue soldiers.

In a new book, The Lucifer Effect: Understanding How Good People Turn Evil (Random House), Zimbardo makes the case that “bad apples” aren’t to blame for evils at Abu Ghraib and elsewhere: he argues that extreme situations and the systems that create them—“bad barrels”—lead ordinary people to behave in horrid ways.

On March 7, roughly coinciding with his golden teaching anniversary and the publication of the book, the psychology professor gave a farewell Stanford lecture. In the packed auditorium, the veteran showman’s presentation combined psychological research with real-world politics, leavened a heavy message with personal history and popular culture, and elicited both despair and optimism about human nature. The centerpiece: a series of snapshots from Abu Ghraib.

It’s easy to loathe the soldiers gloating over their atrocities—Zimbardo calls the photos “trophy shots,” likening them to fishermen’s poses with their big catch. But when Zimbardo describes the hellish, decrepit prison—in which the guards lived in conditions little better than those for the inmates—the soldiers’ actions gain new context. Under frequent attack by mortar fire, enveloped in desert heat and urine stench, the guards worked 12-hour shifts for weeks without respite, with insistent but vague orders to “soften up” their prisoners for interrogation.

You felt sympathetic toward Chip Frederick after getting to know him, but do you feel as much for the other abusers? 

I haven’t studied any of the others in sufficient detail, but they were all pawns of chess masters who orchestrated their game remotely and with no understanding or concern for the humanity of these soldiers or the dignity of their prisoners.

The supposed worst of the other MPs was Corporal Charles Graner. In the middle of the abuses he got accolades from his lieutenant colonel for the work he was doing in preparing the detainees for interrogation. I am sympathetic that he was then sent away for 10 years after doing this acclaimed duty.

You write in your book, “There are no special inner attributes of either pathology or goodness residing within the human psyche or the human genome.” Isn’t this an empirical question—something for behavioral geneticists to answer? 

Behavioral genetics cannot deal with highly complex behaviors, and certainly not generic ones like good and evil. There are no data on genes that predispose toward good or evil, and any such data would be so weak as to apply [only] to a minority of cases. If your mother and father were both schizophrenic, the probability you will be is only 50 percent.

You contend that abuses, those at Abu Ghraib and those in your prison study, started with good people—people who had passed psychological evaluations. But can we know that they were good? Maybe the psychological tests aren’t getting at something important.

They’re not. All that personality tests can do is predict how you’ll behave in situations that you’re familiar with. They can’t predict how you’d act in a totally new situation. But the tests do tell us that at the time these people completed them, they fit in the normal range of all people taking the tests. So when we put these good guards in a bad place, the place changed their personality in ways they couldn’t imagine.

You make the classic distinction in social psychology between the person and the situation, but you also bring the system into it. What made you think about that third piece?

Even with the Stanford prison study, I was unaware of the power of the system, because I was the system. It wasn’t until I was preparing for Chip Frederick’s trial by reading these investigative reports that I said, “Oh, my God, what’s really important isn’t how terrible the situation was in Tier 1A, but how is it possible that any military system allowed this terrible environment to exist?” So I began to say the most important thing is the cruel and inhuman system, because the system is where the power is.

I should have thought of it earlier: my major was sociology and anthropology before I turned to psychology.

Some people say situations are easier to change than people. If you accept the “broken-windows theory,” which your own early research supports, then fixing urban decay seems doable: you repair broken windows, you paint over graffiti—and street crime sharply declines. Prompt, detailed ounces of prevention provide big cures. But when you start talking about a system that creates bad situations, you’ve got a huge problem. Because aren’t systems really difficult to change? 

Systems are really difficult to change, but you can’t even conceptualize what a change would look like until you realize the system is where the power is—and you begin to investigate where there’s leverage.

One shining example is South African apartheid. It had been going on for 100 years, and how did Nelson Mandela and his colleagues change that from prison? It took them 25 years, yet they did it. It started by changing the guards’ perception of them, sending out messages to the community about acting with dignity, and then getting other countries involved, with American colleges saying, “We’re not going to invest in South Africa.”

The same thing with the war on terrorism: the only really effective thing they’ve done is freeze the assets of groups that support terrorism.

You suggest looking at the highest levels of power, the “barrel makers” who create systems and situations. But aren’t those people subject to system forces, too? 

Yes, the person at the highest level is in the context of getting elected. Today you have lobbyists from Israel saying, “Why is [Nancy] Pelosi talking about ending the war? We’re going to be more vulnerable.” And they’re threatening to cut off financial support. At that point the politician has to say, “I have to get money from somewhere else, and if I don’t get enough I can’t get re-elected, in which case I can’t work on other issues.”

The ultimate power is the power to frame the issues, to say, “This is a war against terrorism.” Is there anybody who’s against national security? But then you fall into Erich Fromm’s Escape From Freedom analysis, which is that the only way authorities can reasonably talk about guaranteeing your security is if you surrender your freedoms. But when you give up your freedom, that’s always real, whereas security is an illusion.

In the book you repeatedly say that you’re not practicing “excusiology” for the abusers, that understanding the why of what was done does not excuse what was done. Can you have it both ways—blaming the situation and the system while still holding individuals accountable? Is the solution simply a lighter penalty because of mitigating circumstances?

Legally, individuals are always accountable for their actions and found guilty if they break laws, civil or military. And situational forces should be invoked to mitigate the sentence once guilt is established. Currently there is insufficient appreciation, or an outright discounting, of how powerful those forces can be and of the extent to which they can play a major role in causing illegal, immoral behavior.

We have to more fully appreciate the extent to which human behavior can come under the control of a host of situational forces in certain behavioral settings. As those forces become more extreme and intense, a greater percentage of ordinary, even good, people will be swayed, seduced, initiated into doing things that are unimaginable to them when they are outside the constraints of that situation. Not everyone is susceptible to those forces, only the majority of people, but that is a big number.

How does heroism, which you’ve begun studying, fit into this situationist’s model? How do you account for that one person out of 100 who does the right thing?

At this point we don’t know. But situations can be subtle. A professor gave a talk here a few years ago. He’s going on and on, and finally somebody asks a question that he can’t answer, and he says, “I’m really feeling sick and I’ll get to that in a minute.” I think, is he saying he’s sick of this kind of question? Suddenly he’s speaking much slower. . . . I move closer and closer until I’m right in front of him . . . and I notice his pants are now wet, so I say something like, “Maybe you should end.” And at that point he falls on me, and had I not been there, he’d have smashed into that chair. We called the paramedics, and it turned out he had the flu. So in a way it’s a heroic act because if I do nothing, nobody knows but me, but on the other hand, suppose I made a mistake?

You’re weighing the costs of taking action . . .

Essentially, it’s shame and guilt: you have to live with the guilt of not doing what you should have done vs. the shame of doing the wrong thing. All my life I’ve done things to make people laugh at me, and playing the fool means when the time comes I don’t care if people laugh. Also, there’s the situational thing: it’s only because I was sitting in the front row that I knew what was happening and stepped in.

So we don’t really know that everybody is in the same situation just because they’re in the same room.

Exactly. It’s the same with the prison study: some guards were on a shift where they didn’t see most of the bad stuff, because they were out getting breakfast or lunch for the prisoners, and most of the bad stuff occurred on the night shift.

Why, throughout the book, do you use the word “evil,” which is such a loaded word? Why not just talk about aggression, since that’s a straightforward psychological term?

I’ve done lots of work on aggression, but aggression for psychologists has always been one-on-one. Once you have torture, where it’s a systematic program to instill fear in a community, to use specialized tactics to get information to break people’s will, that’s not aggression. That’s not even violence.

For me, evil is the highest level of inhumanity. It could be one-on-one, like the torturer and his victim, but more often than not at that level it’s the individual as an agent of a system.

One of the things I try to get across is, it’s really noble ideologies that allow the worst possible destructions, because you could always say, “I did it for God.” Throughout the world, evil occurs almost always in the name of religion or of national security. In the beginning of Mein Kampf, Hitler says, “In dealing with the Jewish question, I’m doing the Lord’s work.” No evildoer ever believes he or she is doing evil.

That gets to Hannah Arendt’s idea of “the banality of evil,” but you take it in a new direction.

Before Eichmann went to Auschwitz, he’s normal. The psychiatrist evaluating him says, “He’s more normal than I am.” So with the banality of evil, Hannah Arendt is saying that the evildoer looks just like us. But what she should have added is, that it’s only when the evildoer is in a special situation that he’s transformed. It’s only when he has the ideology, when his mission is to efficiently destroy as many of these people as possible . . . Eichmann’s job was to get 100 people to kill 2 million, and he did that really well.

So there are people like Eichmann who are situationally evil. Just like there are people who are only shy on blind dates—and when you ask them, “Are you shy?” they’ll say no.

The same is true of heroism—there are people whose whole lives are organized around service to others. Those are the rare exceptions among heroes in the same way the chronically evil people are rare. They stand out in our minds because they’re rare. So here are the chronically evil and the chronically heroic, situationally evil and situationally heroic—the everyday heroes, who in a particular situation, with no prior history of doing it, move from passivity to action.

The following interview questions and answers did not appear in the print edition of Stanford.

You outlined some strategies for resisting situational forces, and I’d like to hear you connect them to Abu Ghraib. What could Chip Frederick have done that he hadn’t already done?

That’s a really good question. At this point, I’m not sure what he could have done. He complained to senior officers that there were no rules of engagement, a lot of improper policies, patients with mental illness mixed in with other prisoners, and so on; and they just told him, it’s wartime and you have to deal with it. To be frank, Army reservists are the lowest form of life in the military, and he’s an Army reservist in a dungeon, in a horrible prison, in a horrible war, and he’s in a position of little power to change anything.

Usually you have to go to the next higher level. After the My Lai massacre, Ron Ridenhour [a young enlisted man who heard about the incident] insists on an investigation; he goes to the military officers, and they disregard him. So he goes to the next level: he writes his congressman, and nothing happens. It was only after he got [investigative journalist] Seymour Hersh involved, who was outside the system, that anything happened. While Ridenhour was in Vietnam, he couldn’t do anything. So the bottom line is sometimes when you’re trapped in a situation and you have little personal power it’s very difficult to change it.

This term “the Lucifer Effect”—did you coin it so people would stop calling it the Zimbardo Effect?

The Zimbardo Effect was a journalist’s shorthand for explaining abuses in some prison, to say that it’s not that the guards are rednecks. But the Lucifer Effect is a much broader concept.

When I was a kid, my mother always made me take my brothers and sisters to church, Sunday school, Tuesday catechism. (My parents never went to church. It wasn’t until later that I realized that the only reason they did this was we lived in this tiny apartment where there was no privacy, and it was the only time my parents could have sex.) As a kid I wondered, “How could God’s favorite angel become the Devil?”

When God created Adam as his perfect creature, he said all the angels have to honor Adam. And Lucifer refused to do that, and God took that as an act of disobedience, and also a sin of envy and jealousy, and sent Michael the archangel to punish him, and a bunch of other angels sided with Lucifer. Paradoxically God created Hell as a place to put them. I thought that was a bad deal—why was there no consultation? And over time I came to believe that Lucifer was actually right—because why should angels bow down to Adam, a mortal, if he can be so easily corrupted?
