
The Origins Of Evil

When the fire threatens to raze us, there’s just one tune pervading our mind, body, and soul: that morality isn’t a choice as much as a privilege.

On June 18, 1961, a half-page ad appeared in a daily newspaper in New Haven, Connecticut. “We will pay you $4.00 for an hour of your time,” read its headline. Conducted at Yale University, the “scientific study of memory and learning” required 500 men. The ad’s bottom section contained a disclosure form, comprising an applicant’s personal and professional details, addressed to Professor Stanley Milgram. Hinged on a deceit—the study tested not “memory and learning” but obedience to authority—it sought to unravel the psychology of genocide: “Could it be that [Adolf] Eichmann and his million accomplices were just following orders?” wondered Milgram. “Could we call them all accomplices?”

Inspired by the trial of Holocaust perpetrator Eichmann—who, according to historian and philosopher Hannah Arendt, didn’t resemble a vicious anti-Semite as much as a numb bureaucrat, someone embodying the “banality of evil”—Milgram began his experiment. It involved three participants: an Experimenter, a Teacher, and a Learner. The Experimenter controlled and administered the test. The Teacher, who had responded to the ad, delivered electric shocks to the Learner when he gave wrong answers to the word-pair questions. The Learner, however, an actor and a confederate sitting on the other side of a thin wall, received no shocks—his crying and pleading came from pre-recorded audio—a fact not known to the Teacher (the Subject). Every wrong answer demanded a shock 15 volts higher than the last, climbing to 450 volts, a lethal dose. As the jolts intensified, the Learner begged and sobbed and screamed. At 300 volts, he kicked the wall and stayed silent—forever.

If the Teacher hesitated, the Experimenter provided sequential prods: a) “please continue”, b) “the experiment requires that you continue”, c) “it is absolutely essential that you continue”, and d) “you have no other choice, you must go on”. If the Teacher said no to all of them, then the Experimenter terminated the test.

When Milgram asked psychology students, his colleagues, and psychiatrists to predict the experiment’s result, they all agreed that only a small percentage of Teachers—ranging from 1.2 per cent to 3.7 per cent—would administer shocks exceeding 300 volts. During the test, many “showed signs of nervousness,” wrote Milgram in his 1963 paper, “sweating, trembling, stuttering, groaning, biting their lips, digging fingernails into their flesh”. Many broke into fits of nervous laughter. Some suffered from “full-blown” and “uncontrollable seizures”. The outcome shocked Milgram: all Teachers reached the 300-volt mark and a staggering 65 per cent finished the experiment, delivering the 450-volt shock. The final implication wrote itself: obedience to authority and actions divorced from responsibilities could cause murder.

In August 1971, Stanford University professor Philip Zimbardo released an ad in a city newspaper in Palo Alto, inviting volunteers for a 14-day “psychological study of prison life”, paying them $15 a day. A coin flip divided 24 of them into two groups of prisoners and guards. He converted the psychology building’s basement into a mock prison, where the ‘inmates’ wore uncomfortable smocks (without underclothes), stocking caps, loose rubber sandals, and heavy chains on their right ankles—“bolted on”, as Zimbardo detailed in his paper, and “worn at all times”.  


The guards, not “given any specific instructions”, could “make up their own rules”. They donned “khaki uniforms”, hung “whistles around their necks”, carried “big billy clubs”, and wore “special sun-glasses”—“an idea borrowed from the movie 'Cool Hand Luke'”. The first day registered no violence but soon, the prisoners rebelled and the guards retaliated—with escalating psychological and physical abuse—to the extent that Zimbardo had to abort the experiment on the sixth day itself. “By the end of the study,” he wrote, “the guards had won total control of the prison, commanding the respect of each prisoner or, more accurately, their obedience.”

Zimbardo sought to unravel the following: How would a person behave if his actions had no consequences (akin to anonymity in a mob)? And could certain environments endorse, or encourage, vicious tendencies in those who had not exhibited them before? The results of Milgram and Zimbardo, high school peers, reinforced each other, making them celebrities. Their experiments inspired books, plays, songs, documentaries, fictional films, a band’s name—and more. And an even bigger star shone above them, popularising and validating their findings: Arendt, whose work Milgram had cited in his 1963 paper.


So far so neat: a problem, an experiment, a solution. But if we are using these tests to understand the roots of evil, as many have for decades, then it’d be fair to ask: What about the experiments themselves? What was really happening in them beyond the shocking conclusions?

Turns out, quite a lot.

Australian psychologist Gina Perry revered Milgram for a long time. But then she examined his personal accounts and the experiment’s data, and interviewed his subjects. A “more complicated story” emerged as a result, informing her book 'Behind the Shock Machine' (2012). Milgram had not conducted one experiment but “over 20”, comprising different variations, stories, and actors, totalling over 700 subjects. In one version, the “experimenter gave orders over the phone”; in another, the teacher and the learner sat in the same room; and so on. Remember the fabled 65 per cent obedience rate? That came from one version, the first, with 40 subjects. “And in over half of all his variations,” she wrote, “Milgram found the opposite result—that more than 60 per cent of people disobeyed the experimenter’s orders.”


Besides several ethical problems—Milgram hadn’t debriefed his subjects properly—Perry found out that many teachers, after pushing the high-voltage switch, did express remorse. Just a month into the experiment, however, Milgram had likened them all to Nazis (in a letter to the National Science Foundation). After the experiment, a subject, Bill Menold, went to his neighbour and shared his concerns. “It didn’t make me feel very good,” he told Perry. “You know, the cruelty involved.” Herb Winer left the lab “furious”, worrying about the health of the learner. Bob Lee, another subject, told Perry he didn’t fully understand the experiment.   

Academic Don Mixon, who recreated the experiment, told Perry that “immorality” didn’t drive Milgram’s subjects as much as the “trust in the experimenter” who, time and again, told them to continue. Many subjects, then, perhaps persisted to serve good science. “According to Don,” wrote Perry, “Milgram simply measured the faith people put in experts”—that they’ll “go to great lengths, even suffer great distress, to be good.”


The Stanford Prison Experiment, too, largely remained untainted for decades—until researcher Thibault Le Texier revisited it via filmed footage, audio clips, and archival documents. He published his findings in a 2019 paper in American Psychologist, the flagship journal of the American Psychological Association, rubbishing Zimbardo’s pivotal claim that “neither the guards nor the prisoners received any specific training” and that their outbursts resulted from an organic oppressive environment. Zimbardo and his associates, revealed Le Texier, had trained the guards on how to “create a pathogenic environment”; “intervened to give precise instructions”; and, even, made the guards “believe they were his research assistants”.

And then, the testimonies. Here’s Guard 11: “Prof. Zimbardo directed me to act a certain way (ex. hard attitude Wednesday following Tuesday leniency).” Guard 4: “Throughout the experiment I was role playing.” Guard 1: “I was always acting.” Several prisoners echoed the same feeling, saying they “always knew it was an experiment”. Even the final data, added Le Texier, was “neither complete nor uniform”. Zimbardo and his assistants “collected no data” on the third day. The experiment ran for 150 hours. The amount of video footage publicly available? Six hours.

Over the last decade, Arendt’s work has also attracted cogent criticism. New evidence—Eichmann’s interviews—shows he was, indeed, a vicious anti-Semite who acted with full awareness and intention, taking pride in, and expressing no remorse about, his well-thought-out crimes, as philosopher Bettina Stangneth and historian Deborah Lipstadt have argued in 'Eichmann Before Jerusalem' (2014) and 'The Eichmann Trial' (2011), respectively.


So, what does it all mean? Do Milgram and Zimbardo have nothing useful to say about authority, obedience, or evil? A simple cross-check can resolve the problem: replicating the experiment across different conditions. Unlike the Stanford Prison Experiment, whose unethical methodology prohibits recreation, over a dozen versions of Milgram’s experiment have been successfully reproduced in many countries.  

Plus, why hide in a lab when you have life? In 1974, 28-year-old Serbian artist Marina Abramović stood on a stage in Naples, Italy, and placed 72 objects on a table: a feather, a rose, a metal bar, grapes, wine, scissors, a scalpel, a bullet, a pistol. “You can use them on me as desired,” she told her audience. “I’m the object. During this period [lasting for six hours], I take full responsibility.”

Someone offered her a rose; someone gave her a peck on the cheek; someone turned her around. And then, the night turned darker. In the third hour, “they cut my neck and drank my blood,” she recalled in an interview. “They carried me around, opened my legs, and put a knife between them. They took scissors and ripped my clothes.” Art critic Thomas McEvilley, who was present at the event, added, “In the fourth hour, the razor blades began to explore her skin, [followed by] minor sexual assaults. When a loaded gun was thrust to Marina’s head and her own finger was being worked around the trigger, a fight erupted between the audience factions.”

Now consider the 1966 Hofling hospital experiment, where a ‘doctor’ called a nurse to administer 20 mg of a fictitious drug, “Astroten”, saying he’d sign the approval for the medication later. The approved list didn’t contain the drug, but the medicine cabinet had an Astroten bottle in it whose label specified the maximum dose: 10 mg. Psychiatrist Charles Hofling asked 12 nurses and 21 nursing students how many nurses would follow the order. They said two (6.66 per cent). Hofling selected 22 nurses for the experiment: 21 of them complied (95.45 per cent).

There’s more. On August 4, 1994, a man claiming to be a sheriff’s deputy rang a McDonald’s in Saybrook Township, Ohio, and informed its manager, James Turcotte, that a customer’s purse had been stolen from the restaurant. He ordered Turcotte to call two minor female employees, Rhodes and Bratzel, individually, to his office. Assuming an authoritarian tone, the alleged deputy instructed Rhodes to allow Turcotte to strip-search her. She removed most of her clothes. Bratzel refused a strip-search but said yes to a ‘‘pat down’’ search. The diktat and fear of an authority not even present in the room was so pronounced that one stranger could order another to molest his employees. This one incident was disturbing enough, but it happened again and again—across 30 states, over 70 phone calls, for 10 years—as if the Milgram experiment had a telephonic counterpart. In the subsequent years, the strip-searches devolved into kissing, inappropriate touching, spanking, rape. Like the above experiments, this case elicited a similar baffling response: why would anyone do it?

And yet, they did.    

Maybe we all will, given the ‘right’ circumstances, the right master, the right threat. Authority demanding obedience compelling harm may be tough to simulate in a lab—a loss of control, after all, doesn’t respond too well to a controlled environment—but that doesn’t mean our latent duality, cruelty substituting servility, ceases to exist. It’s often said that people contain multitudes. No doubt, but of what kinds? Because sometimes we can be switches: one flick, compassion; next flick, callous. It’s easy to toss a nonchalant response from the cushy cocoons of our current safety, but when the fire threatens to raze us, there’s just one tune pervading our mind, body, and soul: that morality isn’t a choice as much as a privilege.
