Audio tapes reveal mass rule-breaking in Milgram's obedience experiments (psypost.org)

by lentoutcry 133 comments 227 points

[−] sarchertech 45d ago
If you read Gina Perry’s critique, her conclusion is that fewer than half of the participants thought it was real.

These were Yale students, so probably smarter than average, and from what I’ve read the study didn’t do a very convincing job of making it seem believable.

When I took psychology in college I had to participate in random experiments as part of my grade (there were alternatives, but the experiments were easier). Before I’d ever heard of Milgram, if one of those studies had put me in a similar situation I would have smelled a rat immediately.

When I was in middle school the teachers created a fake “government decree” to convince us that there was a new sin tax on products kids use (as a simulation). I immediately knew it was fake as did many other students, but that didn’t stop us from playing along for fun. I talked to a few of my teachers later and they genuinely believed that we fell for it.

[−] seanw444 45d ago
That's pretty fun that your teachers did that. I wish teachers attempted to immerse students in the things they're teaching about more often, rather than just reading about it in abstract through a textbook or whatever.
[−] joe_the_user 45d ago
I had a Junior High School teacher who did a variety of immersion lessons. The problem was that even a small deviation from the real-world structure turns the exercise into a pretty simple game. Essentially, the results are too complex and muddy to extract an overall lesson.

And social science/history/economics is about learning the standard lessons of the field (even if those lessons are themselves simplistic compared to the real world, they are a baseline of common knowledge).

[−] acjohnson55 44d ago
I did one of these experiments around 2011, and because it was so obvious that the experiment was contrived, there was a lot of misdirection around the actual experiment, which was testing something totally different from the pretense. Like different responses to font color or something like that.
[−] trc001 42d ago

>These were Yale students, so probably smarter than average

In my experience, Ivy League students are some of the most profoundly stupid people I've ever met

[−] janalsncm 45d ago

> Interviewing the original participants―many of whom remain haunted to this day about what they did―and delving deep into Milgram's personal archive, she pieces together a more complex and much more troubling picture of these experiments than was originally presented by Milgram.

Just reading the Amazon summary, I feel like there’s a contradiction. If subjects were just trying to get it over with, then yes, it invalidates the study, but the only troubling conclusion is that the study wasn’t scrutinized more closely.

I also don’t see why they would be “haunted” by what effectively amounts to a chore to get their $20 participation check.

[−] sarchertech 45d ago
She never said that none of the participants believed it was real.
[−] Mordisquitos 46d ago
Interesting. If we can assume the experimenter's failure to enforce the rules was mere clumsiness or incompetence, rather than an indicator of underlying intentional manipulation of the experimental conditions à la Stanford prison experiment, this can be interpreted in many different ways.

The (eventually) disobedient subjects were better at respecting the experimental process they were given than the "obedient" ones who went all the way to the maximum voltage. Why was that?

Could it be a sign that the disobedient subjects were on average more concentrated on the task at hand (smarter? less stressed? better educated? more conscientious?) than the ultimately obedient ones, and therefore were more likely to realise they were "hurting" the alleged learner and stop?

Or could it be that the obedient subjects were more likely to realise there was something fishy going on, suspecting the "learner" wasn't really being shocked, and thus were paying less attention to the learning rules?

Or was it, as the article suggests, that the obedient ones may have shut down emotionally under pressure to follow through, and their mistakes are the result of that?

Or were the obedient ones more likely to be actual sadists, who were enjoying the shocks so much that they didn't even care if the "learner" didn't hear their question, giving them a greater chance of shocking them again?

Unfortunately I think the Milgram experiment has become so entrenched in popular culture that there's absolutely no way it can be properly repeated to explore these questions.

[−] user2722 46d ago
It really calls into question the conclusions drawn from the last 50 years. Here are the disproven ones I remember:

* kids grow to be rich because they accept delayed gratification

* alpha males are the leader of the pack and all other males are useless

* people accept violence if there is a higher authority which justifies it with a reason

How many people suffered or delivered suffering because of their beliefs in the above?

[−] Intralexical 45d ago
Making someone think they're an accomplice to torture is itself recognized as a form of psychological torture. Telling someone that they're helping to advance science proves nothing, except that people can be deceived, manipulated, and exploited by bad actors.

Milgram decided to repeat his gross ethical violation 30 times(!), with dozens of test subjects each time. Overall, the majority of people actually disobeyed the orders to continue with higher voltages.

I think the only reason it's become so popular is because it makes for a shocking story, with grandiose implications. The specific "agentic state theory" Milgram invented is not backed up by his data, and personally, I find it philosophically dubious and psychologically concerning that he gravitated to it.

See:

https://www.bps.org.uk/psychologist/why-almost-everything-yo...

https://journals.sagepub.com/doi/abs/10.1177/095935431560539...

[−] kakacik 45d ago
The first point, as I can see in my own life, is valid. Not properly rich by any means, but I’ve vastly surpassed any expectations and most of my peers from earlier life (which is rather easy when coming from poor eastern Europe, but somehow most folks from back home didn't, too deep in their little comfort zones or fears of risks that were mostly made up).

It can be reframed roughly as discipline too: a willingness to suffer a bit for later rewards. I can see this being a massive success multiplier in many real-world situations.

[−] burningChrome 45d ago

>> willingness to suffer a bit for later rewards.

Almost every person I went to college with had this viewpoint. There's also something comforting in knowing you and your friends are all doing the same thing. We were all dirt poor in college, trying to support ourselves with crappy part-time jobs: delivering pizza, working in fast food joints, cleaning offices at night. The idea was that we all believed we were working towards something better than our current situation. The suffering somehow made you a better person, more resilient, made you understand what it was like to really earn something.

All of my close friends I had in college all went on to do successful things. Engineers, attorneys, stock brokers, software engineers, pharmacists. We all eventually got to where we wanted to be, but the suffering is what still binds us together to this day. Talking about some of the houses we lived in that should've been condemned. Having to work 60 hours a week, and still do well on that exam on Friday.

The willingness to suffer is eased when you have a shared experience with others around you.

[−] wredcoll 45d ago
The great thing is you can just focus on the one person who "worked hard" or "self disciplined" or "studied well" and got rich while ignoring all the other people who did the same thing and didn't.
[−] kakacik 45d ago
Working blindly hard is rarely rewarded well. Working smart is a much better success story. This can be applied across the whole job market but also within white collar jobs - I saw folks around me almost burn out with little to no reward, when it was fairly clear it would end up that way. I didn't at that point, leaned into stuff in other areas of my life instead, and that worked much better.

I only write about myself and my perspective, have nothing to sell here, just sharing experience. No need to be so dismissive. There is always a factor of luck, but much less so if that approach spans across decades and generally works for me.

[−] KSteffensen 45d ago
Didn't the Dunedin study also find that childhood self-control and delayed gratification correlated with adult life outcomes?

https://dunedinstudy.otago.ac.nz/files/1571970023782.pdf

[−] wredcoll 45d ago
Last I checked, delayed gratification was also highly correlated with having wealthy parents.
[−] nolist_policy 45d ago
Source?
[−] arethuza 46d ago
On that second point - I can strongly recommend the book Goliath's Curse by Luke Kemp:

https://en.wikipedia.org/wiki/Goliath%27s_Curse

[−] cucumber3732842 45d ago
Wikipedia makes it sound questionable at best. I'll wait a decade and see whether it comes out looking like milk or wine.
[−] joe_the_user 45d ago
I don't think experimental psychology ever validated those extremely simplistic conclusions. I'd argue these simplistic conclusions are a "folk summary"/mythical version of a few experiments, and that they come from pre-existing cultural tropes, tropes that were simplified and made more cruel and ruthless by various self-marketing consultants.
[−] Spooky23 45d ago
A lot of the problem with these “disproven” things is that their scope was over-broad, or that they were abused in the popular media beyond recognition.

The delayed gratification thing in particular is correlation vs. causation. It was really more about trust. Forcing kids to delay gratification is meaningless or counterproductive.

[−] user2722 45d ago
Agree. But according to Gemini [for what it's worth], the final 1990 Marshmallow study [since the first versions were cautious] did indeed jump to the conclusion that there was causation pointing toward a better later life. The media might have amplified it, but the wrong (or misleading) conclusion was already present in the _scientific_ paper.
[−] mrguyorama 45d ago
If a scientific paper draws a conclusion, that doesn't mean it's a correct, valid, or properly supported conclusion.

You instead look at the claim, the data, and the experimental methodology. These often say something far, far less generalizable or significant than the conclusion section of the paper.

[−] watwut 45d ago
The thing about experimental science is that you should not draw strong conclusions from one study or one paper. Those should wait until a consensus is reached, until there are many independent studies confirming the same thing under various conditions.
[−] d1sxeyes 45d ago
The Milgram experiment also couldn't be repeated today as it was completely unethical. It caused huge psychological distress to participants to the point that some participants had seizures.
[−] Mordisquitos 45d ago
Maybe we can do a meta-Milgram. A group of junior researchers are tasked with implementing what they believe to be a Milgram experiment, and while performing it the subjects (actually actors) start faking psychological distress in response to having to shock the completely fake learner subjects.

One of the researchers feels guilty from the apparent panic attack his subject appears to be going through, so he excuses himself from the experimental room and approaches the lead investigator who's watching on CCTV from outside:

“Professor, this subject is really suffering from their belief that they are electrocuting the learner. I believe this is unethical, can we stop please?”

The professor replies:

“The experiment requires that you continue.”

[−] Lerc 46d ago
My guess is that it is the pressure to conform working in multiple ways.

Reading the questions while the subject was screaming looks like a performative act of conforming to the pattern, where the failure of the pattern is blamed on the answerer failing to conform. That framing makes the shocks a punishment for failing to conform. The questioner keeps a facade of doing the right thing by going through the motions, even though they are breaking the rules by doing so, because if the other party were compliant that rule wouldn't have been broken. That the shocks were painful would feel appropriate to those with a strong sense that nonconformity should be punished. It is less them following the rules and more them assuming the intent of the rules and permitting abuse because the intent was not their decision. That might make them less willing participants in the abuse and more 'not my problem' active participants.

[−] joe_the_user 45d ago
The reason you have psychology experiments with controls and parameters is that extracting definite conclusions from the simple observation of human behavior is extremely difficult given the wide variety of individuals, groups and cultures.

Once an experiment degenerates into just an event, a situation where the controls have failed, you can come up with many potential conclusions, but you've lost any science-specific conclusions about the observations and you may as well be looking at any series of events.

That said, I think experimental psychology just generally fails to establish enough controls to merit the scientific quality it aspires to.

[−] bambax 46d ago

> By staying silent and letting the memory study fall apart, the experimenter allowed an atmosphere of illegitimate violence to flourish.

Many people are cruel. Not all people, maybe; not most people, also maybe; but some people enjoy hurting others. We see this everywhere. Isn't it possible that this kind of profile jumped at the chance to inflict pain on others with no fear of repercussions?

In other words, isn't this study just a sort of filter to triage/order students from most cruel to least cruel?

[−] raylad 45d ago
One question is whether the participants really believed they were giving shocks to the "learners".

In college I participated in a number of psychological studies that were similarly deceptive, where one of the other participants was obviously (to me) an actor, or, sometimes, a pre-recorded video.

At least one of the studies I participated in was quite like the Milgram study described in the article, where I was supposed to punish another participant. It was very obvious to me that this wasn't really happening, so I randomly punished or didn't punish them, and then afterwards told the researcher that I knew it was all fake.

I think many or most other people who saw through the deception probably wouldn't have let the researcher know, because it seemed somehow disrespectful to tell him.

No idea if he used my results or, as he should have done, discarded them.

[−] xeyownt 46d ago
Without study of the internal motivations, the conclusions of the study are pure conjectures.

You are trapped in an experiment, you have the impression that things have gone too far, and you think you can't escape? You rush it. You hear horrible noises? You just pretend you don't hear them. These are all classical mental patterns. There are a million ways to explain them.

[−] delis-thumbs-7e 45d ago
This study is so flawed in so many ways that it doesn’t prove or disprove anything. The most obvious flaw is the assumption that the test subjects did not realise it was fake. It was not controlled in any way, and many of the subjects (presumably Yale students, and so hardly complete dumb-dumbs) probably thought it was just a lark.
[−] dlev_pika 45d ago

> With the scientific elements either forgotten or rushed, the laboratory changed into a setting for unauthorized and senseless violence.

> The study authors propose that the experimenter played a major, passive role in establishing this dynamic. When the participants broke the rules and skipped steps, the authority figure rarely intervened to correct them or pause the session. By staying silent and letting the memory study fall apart, the experimenter allowed an atmosphere of illegitimate violence to flourish.

This sounds like looting scenarios to me, i.e. when a situation descends into chaos, some people will just surf/leverage that chaos instead of attempting a return to normalcy, for whatever reason.

[−] watwut 46d ago
This one is actually interesting: The statistical difference highlights that the people who eventually quit were actually better at following the scientific protocol than those who went to the end.

And also this: The most frequent violation in obedient sessions (those who shocked till the end) involved reading the memory test questions over the simulated screams of the learner. Doing this effectively guaranteed that the learner would fail the test and receive another shock.

Basically, being willing to shock other people without stopping was more about violence itself being permitted than about being an obedient person. Rule followers followed the protocol until they concluded "nope, this is too much" and stopped mistreating the victim.

[−] animalfarm 45d ago
This is well-documented in Humankind: A Hopeful History by Rutger Bregman. It's worth a read, as it also dispels other experiments in human behaviour that have subsequently been difficult to replicate (for a variety of reasons).

https://www.goodreads.com/book/show/52879286-humankind

[−] janalsncm 45d ago
This seems like something that people would very quickly discover if the data were freely available to listen to. As of now we need to take this study’s authors at their word that the metadata is accurate, and before that we had to take Milgram at his word that his interpretation was accurate.

Apparently if you want to get access to the raw audio, you need to ask Yale. Why?

[−] skrebbel 45d ago
I wonder what percentage of "obedient" teachers saw through the facade, realized that the learner wasn't a very good actor, and were just having a good time playing along with what must've seemed like some psychology professor's weird pain kink.
[−] jordwest 45d ago
My hypothesis: we are social creatures with an innate instinct not to hurt others, but we’ve been trained to various degrees (through upbringing/trauma/school/work) to dissociate from the pain of hurting others.

The people who did continue to administer shocks were attempting to focus on what they thought was the most important part of the task (pressing the lever), but internally the unconscious effort to habitually dissociate from their own discomfort led them to make more mistakes.

Combine this dissociation with a desire for power or status and you get the world we live in today.

[−] crazygringo 45d ago
To be clear, this doesn't seem like it invalidates anything in the original experiment.

The "rule-breaking" isn't referring to anything the researchers were doing.

It's referring to what the participants were doing. It points out that the compliant subjects who delivered the shocks weren't always following the procedure they were given perfectly. Which is, of course, expected, since people in general don't follow instructions 100% perfectly all the time, and especially not the first time they do something.

> Kaposi and Sumeghy interpret these patterns as a complete breakdown of the supposedly legitimate scientific environment. The subjects were not committing violence for the sake of an orderly memory study. With the scientific elements either forgotten or rushed, the laboratory changed into a setting for unauthorized and senseless violence.

This feels like a huge stretch. Forgetting a step at one point or reading something out loud too early isn't a "complete breakdown of the supposedly legitimate scientific environment" -- a "scientific environment" that is completely fictional to begin with.

[−] Intralexical 45d ago
It should have been rejected from the outset. What Milgram did in his experiments was nothing less than construct an elaborate setup so he could psychologically torture dozens of well-meaning people. The ethical violation was already recognized at the time, and given that, nothing else he claims about method or implications can be trusted.
[−] jdawg777 45d ago
It raises the point that if the results are questionable, why not just repeat the experiment?

Here is Derren Brown's attempt at repeating the experiment: https://www.youtube.com/watch?v=Xxq4QtK3j0Y

[−] yashasolutions 45d ago
That's an interesting perspective, and it does expand how we can interpret the Milgram experiment.

That said, the study has been replicated many times since the original, with researchers adjusting different parameters like participant screening, changing the gender balance, or varying the roles (teacher/student, researcher/technician...). Across these variations, the overall result stays quite consistent: under certain conditions, ordinary people can be led to do harmful things.

Other experiments have also looked at which factors make this more likely, and for example, diffusing responsibility seems to be one of the most effective ones.

[−] analog8374 45d ago
Appearance of rule-following is of primary importance, not actual rule-following.

The performance, or signal, or whatever we're calling it. That's the important thing.

[−] palata 46d ago
I have always been pretty critical about "psychology" as a field, but always kept famous successful experiments (like Milgram and the Stanford prison experiment) as examples that "sometimes it's possible to actually get interesting results".

Turns out those are not valid examples either. So I am genuinely wondering: what remains of the field of psychology, except for a group of people who find it interesting to think about how other people think/behave? Are there examples of actual, useful and valid conclusions coming from that field?

[−] convexly 45d ago
So the published narrative survived this long without anyone checking the tapes? Suspect.
[−] jdthedisciple 45d ago
always thought it seemed flawed
[−] geon 45d ago
Is there any information on how many of the participants realized the victim was just acting? Surely it can’t be zero.

https://en.wikipedia.org/wiki/Milgram_experiment

[−] renewiltord 45d ago
This isn't an experiment. It's just some idiot running pseudoscience. Predictably the pop science morons have decided this fake 'research' needs more attention than just dismissal.
[−] mikkupikku 45d ago
What they teach undergrads about the experiment: People blindly follow orders. If the Nazis ordered you to commit atrocities, you probably would!

What the experiment actually showed: People follow orders when the orders are justified within a persuasive ideological context, e.g. you value science and the scientific researcher is telling you to proceed for the sake of science.

In the first framing, people who follow the orders of Nazis are not necessarily ideologically aligned with the Nazis; they might just be in a brainless order-following trance. But this isn't real, and in reality the people who were "just following orders" were in fact ideologically committed to the cause and should be judged accordingly.

[−] phendrenad2 45d ago
Milgram gets thrown around as proof that everyone is just a few steps away from being an agent of evil. Finding out that it actually shows that there are psychopaths among us, and most people actually refused (left the experiment), somehow "clicks" and fits with reality a lot better. We see this in historical genocides - not everyone is in on it, and in fact it has to be covered-up internally because only the psychopaths are able to stomach it.
[−] 9864247888754 45d ago
[dead]