Cognitive dissonance and Egyptian politics

In Egypt, one gets extremely confused on a daily basis, whether watching the catastrophically biased news coverage on TV channels or discussing the same events with “intellectuals” or with the “man in the street”. One is confounded by the incredible amount of illogical nonsense people keep saying. By illogical, I do not mean “irrational”, as in a normative judgment, but rather “not logical”, in the sense that the different elements or components of an assertion do not add up. We face a constant denial of facts, whether it is the denial of violent behavior by Muslim Brotherhood members and supporters or, more obviously, of the full-blown comeback of the security apparatus and its brutal practices.


Let’s change the context. In 1956, Leon Festinger and his colleagues published a quite interesting study, which went on to become a classic of social psychology. The book, entitled When Prophecy Fails: A Social and Psychological Study of a Modern Group That Predicted the Destruction of the World, tackled an odd phenomenon that Festinger and his colleagues termed “cognitive dissonance”. The concept describes a state in which individuals face a dire and brutal contradiction between a set of beliefs and the facts. Wikipedia defines it as “the discomfort experienced when simultaneously holding two or more conflicting cognitions: ideas, beliefs, values or emotional reactions.” Festinger and his team set out to analyze how people reacted, and how they tried to cope, when caught in a state of cognitive dissonance.


At first, Festinger was interested in studying this phenomenon through historical investigation. Throughout the centuries, groups of people, cults, have predicted the end of the world (millenarians, among others). Sometimes they even predicted when it would end, giving out very precise dates. But then the world kept on going. How did these people react when the prophecy failed? How did they cope? That was Festinger’s initial question.


During his work, he was lucky enough to stumble upon a newspaper story in which a certain Marian Keech claimed to be in contact with aliens and predicted the imminent end of the world. A cult quickly formed around her and many people were drawn to it. Festinger’s team was able to infiltrate the cult and observe it from within. On the fateful day, when the aliens were supposed to come and collect the members of the cult in order to save them from the annihilation of mankind, no one came…


Now, from a strictly “logical” point of view, one would suppose that these people, who were not “mentally deranged”, would reevaluate their beliefs in light of this obvious refutation. Yet what Festinger’s team witnessed was quite striking. Members of the cult became even more convinced of their “cause” after the prophecy failed. Even more interestingly, some members who had been doubtful about the whole thing became completely convinced and devoted afterwards.


These people weren’t “crazy”. A few of them were even highly educated. The research team had to look for answers elsewhere: not in the members’ psyches but in their social environment. A few sociological (rather than psychological) hypotheses arose from their observations. Let me offer two remarks based on the study’s argument:


1)   Some members of the cult had sold all their property and belongings, left their children and spouses, and quit their jobs in order to be “taken away” by aliens. We can imagine that such an action is extremely costly. It is hard to see how a person in that situation could return to a normal life afterwards. To put it in simpler words, he had gone too far to be able to turn back. This is what we call a “ratchet effect”. Moreover, that person wasn’t alone; others had done the same, and as Festinger later put it, “If more and more people can be persuaded that the system of belief is correct, then clearly it must after all be correct”.

2)   Most of the members moved in with Marian Keech. Communal life made it easier for everyone to “believe” by sparing them from having to face dissonant points of view. The research showed that the first people to defect, even if they had been ardent believers at the outset, were those involved in other social circles (for instance, those who didn’t move in with Mrs. Keech, who didn’t leave their jobs, who had a family outside, etc.)


This study thus suggests how difficult it becomes for groups to disengage from actions or ideas in which they are entrenched, and why they may appear to be in a state of denial. It also suggests how dangerous it is to isolate a given group even further, or to try to “convince” its members of the falsity of their beliefs through violence.


That being said, evidently, any resemblance to current Egyptian politics is purely coincidental.
