ISIS, Cults, and Religious Extremists: How Mind Control Really Works

The power of social influence is great — and sometimes dangerous. Would you know if someone or something else was controlling your mind? (Illustration: Erik Mace for Yahoo Health)

You know how the saying goes: With great power comes great responsibility. Unfortunately, psychology at its most powerful can be entirely stripped of responsibility.

Groups like ISIS don’t rely on violence alone to get their messages across; they also use psychological techniques to recruit and keep members. Cults and controversial religious groups gain followers and power by instilling lifestyles of fear and obedience — arguably rewiring people’s brains and manipulating their minds. Members then begin to act in ways unrecognizable to family and friends, leading some to wonder: Have they lost their minds?

But what is “mind control”? How does it work? And just how much are we influenced by those in our social spheres? Furthermore, is there hope for those who have fallen victim to this kind of psychological abuse?

The difference between healthy and unhealthy influence

Many experts argue that mind control and social influence — how much your emotions, behaviors, and even opinions are affected by other people — occur on a continuum. They can range from positive and healthy, like friendship, to negative and unhealthy, like imprisonment.

Healthy social influence respects individuality, free will, conscientiousness, honesty, integrity, and accountability, says Steve Hassan, one of the foremost experts on mind control and cults and author of Combating Cult Mind Control: The #1 Best-Selling Guide to Protection, Rescue, and Recovery from Destructive Cults: 25th Anniversary Edition. With a healthy parent-child influence, for example, the parent encourages the child to be him- or herself and grow up to be a good adult, he says.

But on the negative side, influence is destructive. It becomes “about control and obedience, cloning people in the image of the group, not encouraging individuality or creativity, regulating what people read or who they can associate with, and the installation of phobias,” he tells Yahoo Health.

Steve Hassan is an expert on cults — and a survivor of one. (Photo: Steve Hassan/Twitter)

How can you tell the difference? “Ethical groups tell you up front what they want and who they are,” says Hassan. There is what he calls “informed consent” among members. But with cults and groups that practice mind control, there’s “a lot of deception, a lot of lies, and people don’t know what they’re getting into.”

What is a cult, in the first place? Hassan says there are a “million definitions, from theological to sociological. I define a destructive cult as an authoritarian pyramid-structured group that uses deception in recruitment and mind control to keep people dependent and obedient.” Of course, there are benign cults too — fans of rock stars or musicians, for example. And cults aren’t always religious: Hassan calls the Islamic State of Iraq and ash-Sham (better known as ISIS) a political cult that happens to use religion. Cults also come in all sizes, from one-on-one relationships to a state with millions of people.

Most cult leaders, he adds, believe what they are preaching — which makes them more dangerous. The vast majority are narcissistic, probably personality-disordered, and have some antisocial characteristics, he says.

Related: Are We Born Narcissists — or Is It Someone Else’s Fault?

Hassan would know. At 19, he was recruited into the Unification Church of the United States, eventually growing into a leadership position within the cult and breaking away after two and a half years. In a nutshell, the church is a destructive cult whose position is that founder “Sun Myung Moon was the new Messiah and that his mission was to establish a new ‘kingdom’ on Earth,” Hassan writes in his book.

“I wasn’t looking to change religions when I was recruited,” says Hassan. “I was situationally vulnerable. I was a junior in college, my girlfriend had just dumped me, and one day, three women approached me and we started chatting. I thought they were interested in me. They didn’t tell me they were celibate. I had no idea that they were part of a cult. If they had told me what they believed, I wouldn’t have had them sit down.”

Mind-control tactics explained

After that initial conversation, the women Hassan had met invited him on a weekend getaway. “I worked on weekends but happened to get that weekend off, so I thought, Am I supposed to go to this?” he remembers. Hassan attended what turned out to be a textbook recruitment weekend for the group. He was isolated, deprived of sleep, had no privacy, and had hypnotic techniques practiced on him. “Day by day, they wore me down and put ideas in my head — like that World War III was about to happen between the Soviet Union and the U.S. I wasn’t religious, but within a day, it was all about God. When I said, ‘I’m Jewish — I’m not interested in Christianity,’ they did the classic technique of trying to make me feel guilty for being close-minded. Within two weeks, they had their hooks in me. I was made a leader in the cult. I changed into a stranger.”

What Hassan knows now is that, over the course of that weekend, he fell victim to the initial stages of mind control — something that, years later, he would become an expert counselor in.

As defined by Philip Zimbardo, PhD, professor emeritus at Stanford University and former president of the American Psychological Association, mind control is:

The process by which individual or collective freedom of choice and action is compromised by agents or agencies that modify or distort perception, motivation, affect, cognition and/or behavioral outcomes. It is neither magical nor mystical, but a process that involves a set of basic social psychological principles.

But don’t non-cult religions and groups also have influence over us — and what we think and believe? Hassan says that if religious indoctrination is respecting people’s free will, is love-based, and allows people to leave if they want to, then it’s on the benign or constructive side of the influence spectrum.

The process of mind control involves a slew of tactics: isolating people, interrupting their flow of information, overloading them with information, throwing them off balance, and creating mystical experiences. All of these, and more, are part of a set of criteria developed by renowned psychiatrist Robert J. Lifton for determining whether someone has been subjected to mind control.

Hassan developed his own model, called the BITE model, based on Lifton’s criteria, to gauge just how much social influence a group has over someone. BITE stands for behavior control, information control, thought control, and emotional control. You can go through each of those components and size up where a group falls, he says. Hassan refers to ISIS as a mind-control cult at the extreme negative end of the spectrum.

Are brainwashing and mind control the same thing?

An extreme version of mind control has been referred to as brainwashing, a term coined during the Korean War. Initially, it referred to prisoners of war taken by force who appeared — over time, and through enduring torture — to buy into the communist point of view. Later, people began applying the term to situations that don’t involve force, explains Hassan. But experts are divided on the use of the word — and the idea itself. Some argue that it’s outdated and specious. Others suggest it applies only to situations involving force.

“Brainwashing probably does, as a term, apply to the prisoners who turned in the Korean War, but it meets the criteria of decisions and changes in outlook and philosophy that occur under extreme duress,” H. Newton Malony, master of divinity, PhD, and former senior professor at the Graduate School of Psychology at Fuller Theological Seminary, tells Yahoo Health. (Malony is considered a “cult apologist,” a term given to experts who — controversially — don’t automatically assume that just because someone is different or is in a cultlike group, he or she is “brainwashed” or under mind control. Malony has also served as a resource to the Church of Scientology.)

Take ISIS and its social influence, for example: the way the group lines people up and kills them cannot be defended. “Mind control and coercive persuasion occur when a person is not free to counter a thought or enter into a dialogue. And ISIS’ actions represent the ultimate duress,” says Malony.

“Brainwashing is far toward the destructive end of influence,” adds Hassan. “It implies force — kidnapping, beatings, branding, or threatening to kill.”

With mind control, on the other hand, there’s an illusion of having control over your own life, says Hassan. There’s benevolence toward teachers or respected individuals “above” you, and “taking over someone” requires a process, he says.

The dark side of social influence

Less radical groups use psychological tactics as well. Take the Advanced Training Institute, the homeschooling program used by the reality-TV-famous Duggar family, which in effect teaches that sexual abuse can be blamed on the immodesty of the victim.

Related: The Duggars’ Defense of Josh’s Molestations Is Disturbing — but Not Unusual

“I have a real issue with any group where there’s no encouragement for people to have a conscience and where a group proposes ideas like that women have to dress a certain way,” Hassan says. “That’s not on the healthy side of the continuum in my opinion.” Be wary of any group, he says, that pushes a simplistic ‘us versus them’ or ‘good versus evil’ ideology and can’t consider things from a different point of view.

Scientology is another group with recruitment tactics and practices that have been criticized as cultlike — particularly the practice of cutting ties with former members and suggesting that current members cut ties with those who don’t share their beliefs. “I’ve often felt that social shunning — where … [some religious groups] would not allow a person who left to have any contact with the group — is pretty powerful for us human beings. We are social animals and value family and people to a great degree,” says Malony.

The most powerful part of mind control, though, may come from the “T” part of Hassan’s BITE model: thought control. “Early on in my involvement with the cult, my father called and told me he read an article that Moon [founder of the Unification Church of the United States] had a gun factory,” he remembers. Normally, that would have stirred up doubt about Moon and the group. But Hassan had been trained by the group to do what he refers to as “thought stopping” to avoid critical thinking. So instead of asking his father, “How do you know that? What proof do you have?” he started mentally chanting phrases like “crush Satan.”

“You become so polarized against the outside world,” he says. “There’s a tremendous amount of fear that you need to do things the right way.”

The installation of phobias as part of mind control

When the movie Jaws was released in 1975, a new phobia of sharks was born. People pulled their boats out of the ocean and kept their kids on the shore. “People became afraid,” says Hassan. “But the truth is that shark attacks are rare.” The point is that the movie caused a misfiring of our protection system — the mechanism we use to perceive and guard against danger, says Hassan.

Similarly, mind-control groups use phobia indoctrination, says Hassan, to keep members obedient. Sometimes, these can be personalized based on what a group learns about a person. Other times, they’re broader: that you’ll get cancer or be hit by a bus, or your family will be killed if you leave the group. While Hassan was in the Unification Church of the United States, he “was drilled with the fear of evil spirits,” he says. “I didn’t even believe in spirits before I joined the group. But when The Exorcist movie came out, Moon gave a lecture saying that this is what would happen if you left the church.”

Sometimes, in extreme cases like sex trafficking or terrorism, these phobias aren’t just talked about — they are actual, possible outcomes. “Killing is not just a threat in your mind. It’s something real that happens, as seen through the way ISIS kills,” says Hassan.

So if someone is held under these mental conditions, Hassan says they can’t imagine leaving the group and being happy and fulfilled: “The moment they can, they are out the door.”

The power of the situation

But why, you may wonder, don’t people just leave a mind-controlling group?

“The public tends to blame the victim and see people who have been mind-controlled as weak or defective instead of that they were subjected to a social influence program,” says Hassan. “And what social psychology teaches is that we are very social beings — we are hardwired to conform to what we perceive to be our social group. People identify with and follow who we believe to be authority figures, and this can be taken advantage of.”

In fact, social influence is much more powerful than you may realize. Take the classic and controversial Stanford prison experiment that Zimbardo conducted in 1971. (Ethically, it would never be allowed today.) He recruited college students for a planned two-week experiment in which participants were randomly assigned to play prisoners or guards. But the study didn’t last two weeks; it spun out of control, and after only six days the prisoners were showing signs of depression, anger, and anxiety, while the guards harassed them and acted in sadistic ways. The study — which is taught in psychology classes around the world — sheds light on the power of the situation.

It’s research and theories like this that could explain — at least in part — high-profile cases like that of Patty Hearst, granddaughter of publishing magnate William Randolph Hearst, who was abducted at age 19 by the terrorist group the Symbionese Liberation Army and went on to commit a slew of crimes, including robbing a bank.

Can anyone be mind-controlled?

There’s an important distinction between people who join cults and leave a previous life and people who were born into a situation, says Hassan. “For people born in, that’s all they know, and there’s a difference there in terms of sense of self. When I was recruited, my real identity got broken down and replaced by a cult identity.”

Take ISIS, for instance. Islam is, in itself, a peaceful religion, and most practitioners don’t agree with the extremist views of ISIS, which promotes violence in the name of the faith. But ISIS uses (and twists) concepts from Islam — concepts that may be more familiar to someone who is Muslim, who could be the target of recruitment. “You would have to say that their decision is not entirely counter to the culture around them, while it may seem incredibly counter to someone else,” Malony says.

The flip side, according to Hassan, is that people have vulnerabilities. “If someone is broken up or moving to a new city or graduating or has an illness or death, they may be more susceptible to someone new entering their life, because they’re vulnerable,” says Hassan. And people who have trouble reading social cues correctly, or who have a very rule-bound approach to reality, can also be more susceptible to cult recruitment, he adds.

Scarily enough, falling subject to mind control feels a lot like falling in love, says Hassan. “You have that strong feeling to be with someone, so you commit — that’s one slice of what it feels like,” he explains. “You feel swept up in this very intense emotional state. You have this very strong belief and hope that what you’re doing is the right thing.”

Where it differs from falling in love: There’s an “extreme dissonance between your real identity and your cult identity,” explains Hassan. For him, it was tumultuous going in and tumultuous coming out. In his pre-cult life, Hassan was a poet who read three books a week — to him, the essence of being human was being creative. But in the cult, he was told to cut his hair and wear a suit, go to bed at the same time every night, and throw out his poetry as a sign of devotion to God (which he did). “I became like the opposite of who I was before,” he says. “Life became about following orders.”

Undoing the damage

For two and a half years, Hassan followed the orders of the cult — until one day, he fell asleep at the wheel of a car and rear-ended an 18-wheeler. He wound up in the hospital, lucky to be alive, and called his sister. Since she had never criticized his involvement in the cult — or accused him of being mind-controlled — he was allowed to visit her in what he thought would be an opportunity to meet his new nephew. It turned into what experts call a “deprogramming” from the cult. Hassan admits that, at first, he was convinced that the deprogramming team — a group of people including ex-cult members sent to help him start over — had been “sent by Satan.” But when he agreed to listen to ex-members, “lights started switching on.”

Gregory Sammons, the executive director of Wellspring Retreat and Resource Center in Athens, Ohio — a facility that offers clinical counseling, workshops, assessments, and support for survivors of spiritual abuse in high-demand churches and cults — helps people start switching the lights on for a living.

Wellspring is one of just two rehab centers in the U.S. geared toward people who have been in cults — MeadowHaven in Lakeville, Mass., is the other, says Hassan.

“The standard treatment program for cult survivors will typically last up to 10 days,” Sammons tells Yahoo Health. The program includes a minimum of 20 hours of intense one-on-one therapy with a clinician, standardized clinical assessments, and educational workshops, which help the survivor understand the cult phenomenon.

Rehabilitation is not always easy and depends on the person’s experience in the cult. “Early in the treatment, the client and clinician will walk through a timeline of life events before, during, and after the cult,” he explains. “Recognizing what was happening in our lives before the cult can often provide a frame of reference in our recovery, which includes taking back our identity.”

In sessions, the therapist explains what mind control is and conveys to the survivor that being trapped in a cult was not his or her fault. The goal: to provide an understanding of how it happened, why it happened, and how having the right tools can prevent it from happening again, he says.

Identity confusion, phobias, and problems with decision-making, sexuality, sleep, and eating — not to mention trust issues and trauma-related symptoms — are just some of what people struggle with in rehab, says Hassan. “When you leave a group, you’ve got all of this indoctrination in your head and sometimes you don’t know what’s true.” He adds that most people don’t leave cults rationally, after researching them, but rather run away — which makes coming to terms with the change even more difficult.

“Most people operate with an incorrect notion that only weak people who are looking for someone to control them wind up in cults or mind-control scenarios,” says Hassan. “That’s simply not true.” Destructive groups don’t always look destructive at first. They also come in all shapes and sizes — therapy cults, political cults, business cults, religious cults, terrorist groups, and sex-trafficking rings, he says. And humans are vulnerable simply by being a social species.

Education, though, about mind control, cult techniques, and psychology can help, says Hassan. And knowing how to spot unhealthy influences can help you avoid them.

Read This Next: 10 Questions That Tell If Your Childhood Was Bad for Your Health