
Why ordinary people become enablers of war


As the world scrambles to make sense of Russia's attack on Ukraine, we look at psychological theories that might offer insight into why people act the way they do.

Over the past few days, Russian President Vladimir Putin has been sending shockwaves through the world, from a full-scale invasion of Ukraine to putting nuclear forces on special alert.

His actions have been decried by millions, including Russians who have been getting arrested by the thousands for taking part in anti-war protests.

The developments stir up age-old questions. What is it like inside the mind of an immensely powerful figure like Putin? And why do so many people continue to follow him in the face of massive backlash, even after witnessing the devastation caused by the invasion?

Both questions demand complex answers, and we may never have complete ones.

Here’s a look at what psychology has to say about why people commit atrocities under powerful rulers, and why such policies are so hard to break away from in the political decision-making sphere.

Obedience in the lower ranks of the military

In 1961, Yale psychologist Stanley Milgram began a groundbreaking series of experiments, in an effort to understand how ordinary German citizens took part in the Holocaust, by investigating how far people would be willing to go while following orders.

His experiment was simple. Participants would be brought in, thinking that they were part of a study on how punishment affects learning. They were assigned the “teacher” role, while another participant, who was actually an associate of the researcher, was assigned the role of “learner”.

The learner was then tied to a chair and had electrodes attached to their arms. The participant (teacher) was made to believe the electrodes were controlled by a shock generator in front of them which, of course, was not the case.

The teacher was meant to teach the learner word pairs and punish them with an electric shock when they made a mistake. The shock levels ranged from 15 volts to 450 volts, increasing by 15 volts with each new mistake, and the learner played along accordingly.

After a few mistakes, the shock level reached 75 volts, upon which the researcher’s associate began to act out increasing levels of pain from the shocks. The whole time, the participant was in the room with the researcher and believed they were actually administering shocks to the learner.

As shock levels and the painful reaction from the learner increased, the participants turned to look at the researcher for guidance, and even objected to continuing the experiment, but were always met with the same calm face and definitive command—to continue.

The labels on the shock generator escalated from “slight shock” to “danger: severe shock”, and ended with the worrying “XXX” label, indicating the person would die. But with each shock, and each increasingly painful protest from the learner, the bulk of participants continued to obey orders from the researcher.

At the end of the experiment, Milgram found that a staggering 62.5 percent of his participants had delivered the 450-volt shock, with 80 percent persisting even after his associate yelled out, “Let me out of here! My heart’s bothering me. Let me out of here! ... Get me out of here! I’ve had enough. I won’t be in the experiment anymore.”

A crucial point to note here is that the participants actively expressed disagreement and objected to continuing the experiment after witnessing the learner in severe pain—they did not morally agree with what the experimenter was ordering them to do.

The experiment caused extreme distress in the participants, bringing some to the edge of a nervous breakdown within minutes. But even that did not stop the majority of them from obeying the authority figure.

The experiments led psychologists to the conclusion that participants, when placed in a distressing and complex environment, relied on the researcher for guidance as they were the authority figure—the person in charge surely had to know better.

Some also suggested that participants feared backlash from the authority in case of disobedience, or lost their sense of personal responsibility and slipped into a mentality of “I am just doing what I am told to do, I am only obeying orders.”

Psychologists suggest that under extreme social influence, led by a powerful and assertive authority figure, public obedience can rise dramatically, especially during times of crisis and when people are surrounded by others who are also obedient.

This is the case for soldiers on the ground. They do not necessarily agree morally with what they are told to do. They are operating under command in an extremely distressing and complex environment, and when they look to others for guidance, the cues around them constantly reinforce the message that violence is the right path.

Compliance of decision-makers

If people on the ground are just following orders, what about the decision-makers themselves? Why don’t they put a stop to Russia’s controversial advances in Ukraine, even after the nuclear threat has been raised?

Psychologists have explained compliance in high-level decision-making spheres with a phenomenon called “groupthink”, which Yale psychologist Irving Janis developed while analysing decisions such as the Bay of Pigs invasion and the Cuban Missile Crisis.

The idea is that a highly cohesive group, led by a powerful, assertive leader who makes his intentions clear, is likely to exhibit faulty decision-making, as individuals in the group prioritise in-group cohesiveness over ensuring the group reaches the best decision.

This leads people to censor themselves, both to avoid rocking the boat and to escape the personal repercussions they would face for expressing disagreement. Thus, an illusion of unanimity is created, leading the group into further commitment to its policies.

Any spark of disagreement is suffocated by the pressure to comply. Moreover, some actors in the group who agree with the faulty decision, called “mindguards”, can further justify it in the eyes of the group through persistent lobbying.

Janis argued that this was precisely the case for Kennedy and his group of advisers—leading to one of the biggest foreign policy failures in the history of the United States. One of Kennedy’s advisers, Arthur Schlesinger, for example, reported censoring himself despite having severe doubts about the operation.

The weight of President Putin’s leadership, and accounts of his relationship with his decision-making sphere, such as his recent interaction with the country’s spy chief Sergey Naryshkin, suggest that he is a leader who places considerable pressure on Russian bureaucrats.

Putin’s leadership indeed appears to be guiding his decision-makers into compliance, laying fertile ground for groupthink to take hold, leading Russia deeper into its military campaign in Ukraine, and perhaps even toward a nuclear escalation that bears similarities to the Cuban Missile Crisis.


Source: TRT World 
