Creating a company culture for security

A company’s culture goes far beyond what's outlined in a job description. It encapsulates the foundational principles and practices upon which a business operates. 

The cultural characteristics of an organization can even shape its cybersecurity resilience.

After all, in the digital age, security isn’t determined by technological fortifications alone. Social engineering attacks that prey on human emotions remain a top cause of security breaches.

And so beyond firewalls and encryption, the very culture of an organization can play a key role in reducing vulnerabilities. From the way employees communicate with one another to the rigidity of company security policies, the internal dynamics of a business can either fortify its defences or expose it to breaches.

In the following paragraphs, we’ll delve into the importance of creating a company culture for security. We’ll do this by exploring how various negative cultural norms can undermine an organization's ability to stop sophisticated cyber threats.

Team silos

Humans have an inherent tendency to categorise, which can lead to phenomena like ingroup favouritism and outgroup prejudice.

In companies where departments are separated based on specializations, this natural inclination is encouraged and amplified. 

The result? Reduced inter-departmental communication. 

Employees become less inclined to engage with (or seek assistance from) those outside their immediate department.

This lack of communication often means employees remain unfamiliar with their colleagues in other departments. And this detachment fosters information silos where knowledge about one department's operations and expertise remains confined within its boundaries. This workplace environment is ripe for exploitation by social engineers. In fact, if you asked one how they’d want their target company to be set up, they’d describe silos just like these. 

Let’s look at an example scenario:

An attacker decides to impersonate a member of a bank’s legal team and requests IT access. In a company operating in departmental silos, the target IT personnel could be unfamiliar with anyone from the legal team due to the absence of regular communication. As such, they’d probably lack an understanding of the legal team's functions, boundaries, and responsibilities. What questions could they realistically ask to verify the request? The IT team might even be hesitant to check the legitimacy of the request with someone within the legal team, influenced as they are by outgroup prejudice.

“Who’d want to talk to those slackers in suits, anyway?”

A more generalised team structure, where knowledge is diverse (and shared), can significantly reduce the risk of such breaches. Remember, security is everybody's responsibility, and the best way to make people feel empowered is to encourage clear communication among employees.

This is true within a single team as well as across departments. Engineers who operate in silos, focusing only on their small facet of the application they’re developing, might miss some crucial contextual information needed to adequately secure the app they're working on.

We’re living in a world where it’s normal to split employees up based on areas of expertise without fully considering the potential drawbacks of doing so. Clearly, it makes sense to do this to some extent, but there is a certain irony in attempting to counteract the resulting fragmentation with superficial solutions like corporate parties rather than addressing the root cause of the issue.

So, our first tip for creating a company culture for security is to be wary of silos and to encourage empathy and understanding both within and across departments.

Command and control

Many corporate environments operate under the "disagree but commit anyway" ethos, which is itself a derivative of the military's "command and control" principle. While this approach may streamline decision-making (and be perfectly viable in a military environment), it’s less well suited to modern organizations. 

Not least because this approach often stifles open dialogue and questioning. Employees learn that they simply have to execute orders without asking questions if they want to progress - even if they have concerns or reservations.

Again, this culture can be fertile ground for social engineers to succeed. 

By impersonating your boss, they know they can potentially manipulate your entire department. Employees conditioned to comply without asking questions become vulnerable targets.

We wrote recently about how social engineers can manipulate feelings of fear or anxiety. And one very prevalent example of this is the concern over losing one's job - something which has provoked a sense of insecurity among many workers. By the end of September 2023, more than 170,000 US-based tech employees had lost their jobs in that year alone.

Research indicates that the impact of these layoffs is not limited to those who lose their jobs. Both those laid off and the survivors often experience diminished motivation and trust towards their employers. Many report feelings of anxiety or depression. And 65% of those who remain report feeling overburdened.

It turns out this cocktail of reduced motivation, increased workload, and the looming threat of job loss has significant implications for cybersecurity. Employees, feeling like expendable parts in a corporate machine, might either become susceptible to boss impersonation attacks due to their fear of the repercussions of inaction, or become apathetic towards the organization's security altogether.

In other words, when employees perceive themselves as easily replaceable cogs in a machine, they can be much more susceptible to social engineering tactics. So, fostering a culture where employees feel empowered to share ideas, and not just do as they’re told, can aid your company’s security posture.

Blame culture

A pervasive cultural trait that exacerbates the fear of job loss is the blame culture. 

In many organizations, when things go wrong, the immediate response is to find someone to hold accountable. But humans make mistakes and, while the risk of errors can be minimised, it cannot be removed entirely. The focus should ideally be on mitigating the impact of these mistakes rather than pointing fingers afterwards.

Historical events underscore the dangers of adhering to a blame culture. The Chernobyl meltdown in 1986 serves as a stark reminder. The catastrophe's aftermath was made worse by the reluctance of officials to admit the severity of the situation. Their fear of being held responsible and potentially losing their jobs was placed above other considerations. This flawed mentality led to delays in addressing the crisis, increasing the disaster's human toll.

A more contemporary, cybersecurity-focused example is the 2016 Uber data breach. The company's then-security chief, Joseph Sullivan, opted to conceal the breach and pay hackers the ransom they demanded. This decision, likely driven by the fear of being blamed for the security lapse, resulted in legal repercussions for Sullivan and a hefty $148M settlement for Uber.

In a company where a culture of blame prevails, an employee might hesitate to confess that they accessed a harmful file leading to a computer infection. This is a big problem, because detecting an attack promptly can help reduce its overall impact.

For social engineers, a company beset by blame culture is akin to a goldmine. By crafting threats or manipulative requests that prey on an employee's fear of being blamed, they can coerce compliance. A seasoned social engineer might even gather intelligence on an employee's past errors, leveraging this information to craft a very persuasive attack. 

For the reasons above, you should do everything you can to encourage a more open and transparent culture in your own organization.


Micromanagement

Micromanagement, while distinct, shares similarities with the "command and control" approach.

Many companies elevate individuals to managerial roles who may not be equipped to handle the complexities of leadership. These managers, often overwhelmed by the intricacies of their roles, sometimes resort to exerting excessive control over their teams, focusing on the minutiae of daily operations. This approach to management is called micromanagement.

Esteemed thinkers and authors, ranging from Drucker and Ackoff to McGregor and Deming, have labelled micromanagement as a detrimental force in the workplace. Their consensus is clear: micromanagement stifles employee motivation and initiative. When constantly monitored and dictated to, employees become conditioned to act only upon explicit instructions. They refrain from independent decision-making, even in situations that demand it.

In the realm of security, this conditioned passivity can be very dangerous. If employees encounter anomalies or potential threats that aren't covered by company security policies, they might choose inaction over initiative. Given that timely responses are crucial in mitigating many security threats, employees who delay action while waiting for directives leave the organization exposed.

So, while micromanagement offers managers an illusion of control, it can actually weaken an organization's security posture by suppressing employee initiative.

Excessive competition

A widespread feature of many corporate cultures is the promotion of individual or team competition, often manifested through performance reviews. Despite management scientists highlighting the adverse effects of such practices, they remain commonplace.

In environments that foster competitiveness, employees or entire teams often feel isolated. They operate under the perception that their success comes at the expense of their peers. This mindset can be detrimental, especially in areas like IT security where collaboration is crucial.

Research indicates that competition can stifle collaboration. In the realm of cybersecurity, open communication is vital: employees should be encouraged to share observations of potential threats, seek clarity on ambiguous communications, and admit mistakes without fear of judgement. In a competitive setting, however, the fear of appearing less competent than one's peers can deter such transparency.

Furthermore, an overly-competitive culture can foster the same departmental silos we covered at the beginning of this article. Teams, driven by the desire to outperform others, may subconsciously (or consciously!) withhold information or fail to communicate in an effective or timely manner. Such fragmentation presents opportunities for malicious actors, who can exploit this lack of cohesion and unity within the target organization.

A lack of training

A prevailing misconception in many corporate circles is that security, much like quality, is a matter of control rather than assurance. But just as quality assurance proactively ensures products meet set standards, security should adopt a similarly proactive stance. Merely reacting to breaches as they occur can be costly.

Social engineering relies on exploiting human psychology. Given its evolving nature, staying ahead requires consistent updates about risks and a certain adaptability. Research underscores the importance of regular, up-to-date training as a primary defence against such attacks. It's not enough to only incorporate training during onboarding; periodic security interventions are crucial to reinforce the message and enable employees to practice their acquired skills.

Beyond formal training sessions, cultivating a pervasive company culture for security is essential. This entails transcending the notion that security is the purview of the IT department alone. Instead, it should be woven into the very fabric of company culture. When employees across hierarchies know the significance of security and are armed with the knowledge they need, the business’s collective resilience is improved.

This links nicely to another cultural facet that can compromise your company’s security if you’re not careful: employee overload. The effectiveness of training diminishes if your employees are too time-constrained or exhausted to see threats emerging on the horizon.

Employee overload

When people are burned out, their attention to detail, productivity, and overall quality of work is negatively impacted. The stresses stemming from work overload can also cause them significant health issues.

There are a variety of cybersecurity implications, too. Overburdened employees, in their haste, might inadvertently bypass crucial security steps or neglect the smaller details. This can manifest in a number of ways, from failing to recognize the signs of a security breach to inadvertently introducing vulnerabilities - say, by clicking on a bogus link that, at a tired first glance, appeared to be a genuine, urgent request.

Overloaded employees might not prioritise security protocols such as timely password updates. Even if they participate in security training, their engagement might be limited, leading to gaps in their understanding. The quest for efficiency might drive them to adopt shortcuts, potentially compromising security. Examples include resorting to unsecured personal devices for work-related tasks or employing unauthorised software tools.

Employees aren’t always vigilant or alert to unusual activities when burned out, and this makes it easier for malicious actors to operate undetected. Tired employees might also be less invested in the company's security than usual, leading to a more lax attitude overall.

A lack of transparency

Humans possess an intrinsic need to bridge gaps in their understanding - a phenomenon formulated in the "information gap" theory. This theory posits that when individuals perceive a void in their knowledge, they develop a sense of deprivation which drives them to seek out that information. Essentially, a deficit of information gives birth to curiosity.

Decent security training can certainly help employees realise that downloading and executing “Britney-Spears-naked-photoes.exe” from a random email might start an attack.  

Such training acts as a safeguard, reducing the potential for external attacks that prey on human curiosity about the outside world.

But no training can prevent people from being curious about what the recruitment plan will be for next year or what promotions are possible. This kind of information is directly related to work itself, encompassing activities like problem solving, research, innovation, and creative thinking - all of which inherently rely on curiosity as a driving mechanism. Suppressing this curiosity is counterproductive, as it potentially stifles the very essence of intellectual work.

A better strategy to mitigate curiosity-driven vulnerabilities is to enhance information transparency to the highest possible degree. Some companies have even ventured into making almost all internal details, including salaries, accessible to everybody.

While complete transparency like this might remain an aspirational ideal - or might not even be advisable for your organization, as studies from sources like HBR suggest - it's important to embrace some form of transparency. It’s about striking the right balance.

By demystifying certain aspects of the organization and making information as accessible as practically possible, you can potentially reduce curiosity-driven vulnerabilities.


Conservatism

Conservatism signifies a resistance to change: a desire to maintain the status quo irrespective of external shifts. The inability to adapt and evolve in response to market dynamics has been the downfall of many companies, with Kodak's story serving as a particularly effective example.

Conservative culture can manifest in various ways, from an inability to pivot in response to market changes to an overemphasis on setting rigid rules and routines in the pursuit of perpetual efficiency.

Consider a company that wholeheartedly adopts Scrum, mandating all its teams to adhere to the framework. Scrum masters, following the Scrum guide, ensure the preservation of the process. This rigid adherence can be the first step towards conservatism, where established processes become sacred and immune to criticism.

Questions like, "Does security testing fit in this sprint?" might arise, but the ritualistic following of the process will prevent much discussion or change.

So, how can this have an impact on security?

First, a company can become predictable. The repetitive nature of its practices can make life easier for would-be attackers.

There can also be a lack of critical thinking. Employees might cease to question or critically assess ritualistic processes, adhering to them without understanding or questioning the underlying rationale for adopting them in the first place. This lack of understanding can make them more vulnerable to outside manipulation.

Then there’s an overall resistance to change. Even when security vulnerabilities are identified, conservative companies might resist modifying their age-old processes - something that perpetuates security weaknesses.

Finally, there’s complacency. A false sense of security can sometimes emerge, where employees believe that merely following the established "ritual" ensures safety. And so they overlook emerging threats.

Conservatism can also often lead to a lack of training. After all, if everyone believes that the established routines are sufficient and effective, why change them?

Build a company culture that's resistant to emotional manipulation

As we navigate the landscape of cybersecurity, it becomes evident that humans remain the primary target of social engineering attacks. As highlighted in our previous article, "the seven principles of social engineering", many human emotions are susceptible to exploitation. These emotions, while integral to the human experience, can inadvertently become gaps in our collective armour.

Emotions serve as vital catalysts, driving our motivation and responses to various stimuli; they cannot - and should not - be suppressed. Instead, it’s vital to design our structures and work processes in such a way that they reduce the likelihood of attackers exploiting emotions such as fear.

The realm of organizational psychology offers invaluable insights into the genesis and influence of these emotions. By delving into the disciplines of management and psychology, we can better understand and design our systems to be more resilient against potential threats.

But it's vital to understand that the task of system design is not a one-off endeavour. 

In a world characterized by rapid and relentless change, attackers often adapt faster than their targets do. The recent shift to remote working, propelled by the Covid-19 pandemic, has expanded the potential attack surface, with malicious actors exploiting digital communication channels.

To safeguard against such evolving threats, it's crucial to constantly reassess and recalibrate our organizational practices. 

That way we can make sure they align with the ever-changing dynamics of the digital age.

Find out more about the history and psychology of social engineering.