Beware of mental traps

As human beings, our decision-making is heavily influenced by emotions. And those emotions can be exploited by people with malicious intent.

Even when our thinking isn’t clouded by emotions, we aren’t as rational as we’d like to believe. Daniel Kahneman’s work, particularly his Nobel Prize-winning research, challenges the notion of humans as perfectly rational beings.


What are mental traps?

Psychologists have been studying the patterns of irrationality in our cognition for quite a while now. There’s a significant amount of knowledge and understanding around so-called mental traps (known as cognitive biases in the academic world).

A mental trap (or cognitive bias) is a systematic error in thinking that occurs when people process and interpret information about the world around them, and it affects the decisions and judgments they make.

It’s widely believed that mental traps have their roots in the evolutionary history of humans. These biases are thought to be byproducts of mental shortcuts, known as heuristics, that evolved to help our ancestors make quick, efficient decisions, often in an environment where a rapid response meant the difference between life and death.

Many cognitive biases may have developed in response to the challenges faced by our ancestors in the "environment of evolutionary adaptedness" (EEA). This term refers to the environment to which a particular species is adapted. 

For early humans, this environment was characterized by resource scarcity, physical dangers, and the need for social cooperation.

Cognitive biases likely conferred certain survival and reproductive advantages. For instance, the availability heuristic, where individuals judge the frequency or probability of an event by how easily they can recall similar instances, would have been useful in quickly assessing threats (and staying alive). Remembering and overestimating the frequency of dangerous encounters would have promoted cautious behavior, enhancing the probability of survival.

Humans are inherently social beings, and many cognitive biases reflect this social nature. Biases like in-group favoritism (referenced in our article about creating a company culture for security) and conformity bias likely evolved because they facilitated group cohesion and cooperation. These were vital for survival in hunter-gatherer societies.

The development of cognitive biases can also be seen as a trade-off. The human brain, despite its complexity, has limitations in processing power. Heuristics and biases allow for quicker decision-making by simplifying complex information, even though this can sometimes lead to errors in judgment.

While these biases were helpful in prehistoric environments, they’re not quite as suitable for modern society. The same shortcuts that helped our ancestors survive can lead to errors in our complex, information-rich world.

Mental traps can make us behave irrationally. And the problem with irrational behavior is that it’s easily abused. Social engineers can use these traps to exploit us and our systems. And in the 2020s our go-to system is the one that’s always in the corner of our eye, easily within reach, constantly pinging with tempting-looking notifications.

Let’s take a look at a few mental traps within the context of cybersecurity.


Anchoring bias

Imagine an attacker wants to infect a company’s office network with a virus. The virus is on a flash drive and if anyone inserts that drive into their machine, it will install itself and infect the entire network. 

The attacker’s goal is to get someone to insert the flash drive into any of the office computers. But for argument’s sake, let’s say that this particular office is pretty well protected and there’s no easy way to access it physically.

What can the attacker do?

The attacker approaches the security officer and spots a computer over the officer’s shoulder. He pretends to be an employee who needs access to the office because he forgot to send a very important report from his computer.

The security officer asks for ID and an office pass, and the attacker acts as if he’s forgotten both, feigning sadness and frustration. He says he’ll surely be fired if the report isn’t sent on time, and begs for access “just this once”. But again the security officer says no.

Time for another approach. What if, the attacker asks, the security officer could do him a massive favor and send the report from the officer’s computer? The attacker copies the data from his laptop to a USB flash drive, showing the security officer exactly what he’s doing, and appeals to the goodness of his heart to send the report via email.

The security officer looks pensive. But why? 

Why would he be more receptive to this second tactic?

Well, this request appears relatively minor compared to the first one. This poor guy is probably a genuine employee, and he looks pretty desperate. He’s not even asking for physical access to the office anymore.

This is how easily an organization’s entire network could be infected.

If the attacker approached the security officer and asked him straight off the bat to insert the flash drive into a network-connected computer, there’s no way the officer would comply.

This tactic is a clever example of the anchoring bias. It creates a context in which the security officer's decision-making is influenced by the initial, more extreme request. And so the subsequent smaller request seems harmless by comparison.


Sunk cost bias

Our second mental trap that cybercriminals look to exploit is called sunk cost bias.

Think for a second about those infamous Nigerian scam emails that you’ve almost-certainly received at some point in the past decade or two. 

They rely on their victim making an initial payment in the expectation of then receiving a large prize or reward. When there is no prize on the horizon, rational thinking and behavior would suggest the victim cut their losses, realize it was a scam, and admit to themselves that they’ve made a big mistake.  

But in reality that’s not always what happens. 

In many cases, the scammers request even more funds. And the victim, despite seeing zero return from their initial investment, continues to send money. 

Why?

Well, they’ve convinced themselves that by investing more, they’re getting closer to receiving the promised reward. The big payoff. 

This is sunk cost bias.


Neglect of probability, illusion of superiority, and the framing effect

Our brains don’t handle probabilities well, because we have little innate perception of how they work.

Take the following example. Say one hundred people are given the option of entering two different prize draws:

In the first draw, the pot is $10 million and the chances of winning are one in 10 million. In the second draw, the pot is much smaller at $10,000, but the odds are stacked much more in their favor at one in 10,000. Mathematically, both draws have the same expected value of $1 per entry. Still, the majority of participants would choose the first game, mesmerized by the thought of becoming a multimillionaire.
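As a quick sanity check, here’s a minimal sketch of that arithmetic in Python (the prizes and odds are the ones from the example above; nothing else is assumed):

  # Expected value of each draw = prize x probability of winning
  draws = {
      "Draw 1: $10M pot, 1-in-10,000,000 odds": (10_000_000, 1 / 10_000_000),
      "Draw 2: $10k pot, 1-in-10,000 odds": (10_000, 1 / 10_000),
  }

  for name, (prize, probability) in draws.items():
      print(f"{name}: expected value = ${prize * probability:.2f}")

  # Both lines print $1.00 - the draws are mathematically equivalent,
  # yet the larger headline prize feels far more attractive.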

Let’s go back to the infamous Nigerian email or text scams again. The attackers usually present opportunities that seem too good to be true, such as the possibility of receiving a large sum of money. 

The rarity of this opportunity can play on an individual’s desire and desperation to the point that they overlook the low probability of it being genuine. This is an example of the neglect of probability bias.

This mental trap is closely linked to another one - the illusory superiority bias, or the belief that you’re better and more deserving of fortune than others, be that in intelligence, ability, or morality.

“No wonder this exclusive offer has arrived in my inbox.”

Our decision-making is also influenced by the way information is presented to us, rather than just by the information itself.

We obviously prefer the sound of surgery with an 80% success rate to one that comes with a 20% death rate, even though both describe exactly the same odds. This is called the “framing effect”. The same information can lead to different conclusions or actions depending on whether it’s presented in a positive or negative light.


The halo effect, ingroup bias and social conformity bias

At the heart of social engineering is good acting. And good acting relies heavily on the halo effect. Studies show that we treat better-looking or more professional-looking people better and tend to trust them more readily.

With good open source intelligence (OSINT), attackers can find out about a difficult experience their victim might have had in her life, be that the loss of a loved one, relatives struggling with addiction, or the fact that she’s an immigrant who fled a war in her homeland. The harsher the experience, the harder the ingroup bias will hit home when the attacker hints at a shared experience.

The ingroup bias has deep roots in our evolution as social animals, as we touched on earlier. Human beings want to empathize and connect with “their people” - this empathy and trust was vital for our ancestors’ ability to survive and prosper and for the idea of countries and religions to flourish.

Take a look at this video about the power of social conformity.

Having watched it, consider a scenario: 

An employee possesses a security pass that attackers are aiming to copy. 

Merely asking the employee to hand over the security pass would certainly result in non-compliance. But if the attackers knew that the employee was likely to be at a specific location - say at an airport - they could implement a more intricate scheme involving multiple actors. 

One actor would assume the role of a security guard, while others would impersonate regular passengers sitting in the same waiting area as their target employee. The actor portraying the guard would then enter this area and initiate a 'security search'. As part of this orchestrated act, all the actor-passengers would obediently submit their belongings for inspection. 

Observing this, the victim would likely follow suit, conforming to this perceived norm and unwittingly exposing the security pass for copying. 

This is an example of the bandwagon bias (or social conformity) in action.


Overconfidence bias

The overall effect of multiple biases is worsened by the overconfidence bias: we tend to overestimate our own abilities. For example, 65% of Americans think they have above-average intelligence. This means that even if we know these biases exist, we might believe we’re almost immune to them.

“Other people might be irrational and liable to fall for scams, but not me.”

According to the UK authorities, higher levels of education and overconfidence in financial abilities are among the risk factors for social engineering. The more we think we know about a topic, the higher the chance we’ll feel overconfident and fall prey to a carefully-crafted attack.

That’s why even those of us who work in cybersecurity aren’t completely immune to scams.


How to overcome mental traps

Building awareness is vital: people must be aware of these mental traps if they’re not to fall foul of them.

That said, awareness alone isn’t enough. Even those who are aware can be overconfident, and some biases can be accidentally reinforced.

Many mental traps are byproducts of heuristics, which can be triggered when immediate action is required (what is known as System 1 thinking). And it just so happens that the modern world of smartphones and social media, with all those notifications and distractions, is priming you to embrace this mode of being.

A lot of our problematic biases can be mitigated simply by removing the speed constraint. Slow down and take your time. When you do so, your ability to think critically will improve, allowing for more deliberate and rational decision-making. 

When we pause and reflect, we engage System 2 thinking, which can help us to identify and mitigate the influence of biases inherent in System 1's rapid processing.

Speed isn’t the only constraint we need to remove, however. Be self-aware the moment you notice yourself reacting to a request with a warm, fuzzy feeling of social appreciation or a deep underlying sense of self-exceptionalism.

When dealing with such urges, engaging in more deliberate, System 2 thinking can also be beneficial. These feelings are often associated with biases like the need for approval, the superiority bias, or the illusion of uniqueness.

Using structured decision-making techniques can help in overcoming mental traps. Techniques like creating a variety of options, weighing them against a set of criteria, and systematically analyzing these options can mitigate the impact of cognitive biases. 

When presented with a request, write down the reason for it. Then list several options, starting with “comply” and ending with “do nothing”, and analyze the rationale and outcome of each action. While this might seem like a banal exercise, it can help you break the bad habits that heighten the risk of falling for a social engineering attack.
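As an illustration only - the request, options, criteria, and weights below are invented for this sketch rather than taken from any real policy - the exercise could look something like this in Python:

  # A minimal sketch of structured decision-making: list the options,
  # score each against weighted criteria, and compare the totals before acting.
  # Every name and number here is hypothetical.

  request = "A 'colleague' asks me to plug in his USB drive and email a report for him"

  options = ["comply", "verify his identity first", "escalate to IT/security", "do nothing"]

  # Criteria and their weights (a higher weight means the criterion matters more)
  criteria = {"avoids security risk": 3, "follows policy": 2, "helps a genuine colleague": 1}

  # Scores from 0 (bad) to 5 (good) for each option against each criterion
  scores = {
      "comply":                    {"avoids security risk": 0, "follows policy": 0, "helps a genuine colleague": 5},
      "verify his identity first": {"avoids security risk": 4, "follows policy": 5, "helps a genuine colleague": 4},
      "escalate to IT/security":   {"avoids security risk": 5, "follows policy": 5, "helps a genuine colleague": 3},
      "do nothing":                {"avoids security risk": 5, "follows policy": 3, "helps a genuine colleague": 0},
  }

  print(request)
  for option in options:
      total = sum(scores[option][criterion] * weight for criterion, weight in criteria.items())
      print(f"  {option}: {total}")

The exact numbers matter far less than the pause itself: writing the options down and scoring them forces you into slower, System 2 thinking before you act.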

Not so long ago, we published an article about boosting your social engineering awareness, which expands on this advice.

And if you want to explore the psychology behind mental traps a little bit more, here’s some further reading that we recommend:

  • Thinking, Fast and Slow by Daniel Kahneman
  • Predictably Irrational by Dan Ariely
  • The Art of Thinking Clearly by Rolf Dobelli
  • Influence: The Psychology of Persuasion by Robert Cialdini
  • Behave: The Biology of Humans at Our Best and Worst by Robert Sapolsky