Hero culture in cybersecurity: origins, impact, and why we need to break the toxic cycle
Looking at hero culture in cybersecurity, why it occurs, how it is reinforced, what the net impact of it is, and why we need to break the toxic cycle
In October 2023 at ACoD, in one of the sessions of the philosophy track, we ended up discussing an interesting topic - hero culture in cybersecurity. What started as a brief chat, inspired me to think about this problem more. This article, co-written with Kymberlee Price, a fellow ACoD attendee, explores the topic of hero culture in cybersecurity, and several adjacent problems, in greater detail.
The definition of hero culture and its signs in cybersecurity
Hero culture is when a company or a functional area is run by people who are extraordinarily talented, gifted, and strong, putting in superhuman effort or hours to achieve their objectives. This may be because the team doesn’t have enough staff to cover the workload, or because it doesn’t have enough staff with the right knowledge and capabilities; in either case, the heroes carry the burden of working double-time (or more) to do what needs to be done. While there are some benefits to this culture, there are also several negative consequences to consider.
What does a security hero look like?
Caregiver: The Caregiver Hero will regularly work 60-80 hours per week to “make things happen” for the company, customers, or their peers on a (typically) understaffed security team. They don’t take vacations because they feel like the company won’t be able to function if they are away (and sometimes that feeling is validated by rejected time off requests or emergency calls on the weekend / while they’re out sick). Caregivers feel responsible for helping to shoulder the team’s heavy workload and feel guilty taking time off, knowing that their peers will pay the price and get penalized with covering their colleague’s work on top of their own. They may also avoid asking for help because they don’t want to add work for their peers. These folks routinely check email and instant messenger at night and on the weekends, even when there is no active critical incident or deliverable, either because they are afraid of the routine backlog awaiting them in the morning if they don’t, or because they are afraid of being perceived as not working hard enough to keep up with the demands of the business.
Lone Ranger: These individuals centralize a lot of knowledge and use “compartmentalization” as justification to not share information with peers. They are often hesitant to train others to cover for them, saying they don’t have time or “it will be faster if I just do it”. They are eager to come in and “save the day” when others will predictably need their help. Meetings often need to be rescheduled when these heroes cannot join a call because they are essential and have no backup. Like the Caregiver Hero, the Lone Ranger Hero will regularly work 60-80 hours per week to “make things happen” for the company and customers because they perceive that no one else on the team can do what they do. They don’t take vacations because they feel like the company won’t be able to function if they are away. Often, they’ve intentionally or unintentionally cultivated a role that is a single point of failure for the business because they are proud of being invaluable to the organization.
If all this sounds familiar, it is because it should be: hero culture is incredibly common in cybersecurity. The language and imagery in the industry reinforce the idea that security professionals are superheroes with extraordinary abilities who are needed to save the planet from impending doom. Hero culture has become such a critical part of the security field that it is hard to find anyone who questions the origins and the consequences of this reality. Even marketing teams have picked up on the imagery; in an attempt to build rapport with their buyers and users, security vendors also like to emphasize the heroism of cybersecurity practitioners, which further fuels the problem.
Image Sources: CyberHero Network, McMaster University, LIFT, Security Serious Unsung Heroes Awards
Why hero culture occurs
Analyzing the roots of hero culture in cybersecurity is challenging because it is like crab evolution - lots of different paths result in the same outcome. Below are several reasons why hero culture is so deeply embedded in the security discipline, but this is not an exhaustive list. It goes without saying that these perspectives greatly oversimplify reality.
Original hacking culture
Cybersecurity did not start as a professional discipline with seniority levels, certifications, maturity models, and conferences. Information Security evolved from the culture of hacking and phreaking, both of which were a sort of tech magic you could impress your friends with. Tech enthusiasts would gather together either in person or on online forums tinkering with different technologies, breaking into them, taking them apart, and putting them back together. Early security practitioners were driven by a sense of curiosity and discovery. Moreover, there was a spirit of friendly competitiveness, and being the person who solved a hard problem - a hero - wielded a lot of social capital. Everyone wanted to be leet.
Presence of an adversary
As hacking evolved, the community split (roughly) into two cohorts: those breaking into things, and those defending them. The line was often blurry, especially as governments were significantly behind on defining cyberlaw, and as it took a long time to evolve our approach to vulnerability disclosures and bug bounties. Either way, with the split into attackers and defenders, winning became even more important, triggering a high-stakes stress response rooted in fear: if a practitioner lets their guard down, the adversary can harm the organization and steal the data they are in charge of protecting.
Needing to rely on people who “know what they are doing”
Companies took a long time to recognize that security is something they need to worry about, and when they did, there was a lack of understanding of how many security engineers would be necessary to do a $THING. There was no cybersecurity knowledge base, and few practical standards or best practices one could follow, so companies had no choice but to look for people who “knew security” and could just “do their thing” in their enterprise.
Relying on individuals and their ability to do what needed to be done in an unstructured environment was a logical step. Moreover, even as the industry was evolving further and the first security frameworks and best practices started to emerge, they were only relevant to the largest technology enterprises that had developed them, such as Microsoft. The vast majority of companies could not see the relevance of these “best practices” to someone with their level of resources, so they doubled down on delegating hard security problems to their IT teams asking them to do the best they can with what they have.
Special relationship between cybersecurity and the military
Another factor that contributed to the growth of hero culture in cybersecurity is security’s special relationship with the military and intelligence agencies. Since the government holds a monopoly on force, it is in the military and at the special agencies where the most cutting-edge offensive cybersecurity capabilities are developed, tested, and deployed.
Cybersecurity has borrowed a lot from the military. The industry uses the military's approaches to staffing security operations centers (tiered analyst model), security practitioners are frequently awarded medals and medallions for their service and for winning hacker competitions, and some argue that the security industry’s love for abbreviations also comes from the forces. Most importantly, the cybersecurity field attracts ex-military talent. People who served in the army bring to security a sense of mission that the cybersecurity profession is known for.
Needing to rely on people who “know what they are doing” (yes, again)
Because early small teams of experienced experts got a lot done by pouring their lives into their jobs to prove their value, a lot of companies have built a staffing model around this behavior. Seeing the productivity of these teams, and that salaried security practitioners would work overtime to make sure security work was completed, companies got comfortable understaffing security teams whose staffing needs they didn’t really understand to begin with. What started as a natural desire of corporations to save money has evolved into a chronic problem of under-resourcing security teams.
The issue isn’t that the desire to be seen as a hero is bad on its own; the issue is that companies are taking advantage of it, and it is causing employee and business harm.
Psychology of the cybersecurity profession today
In recent years, we have started to talk about the importance of mental health in the cybersecurity field. This is to a large degree thanks to initiatives such as Mental Health Hackers, a community project led by Amanda Berlin, Sr. Security Architect at Blumira, as well as the work of researchers such as Dr. Stacy Thayer, Professor of Cyberpsychology, host of the CyberPsych podcast, and founder of the SOURCE Boston security conference, who speaks at security conferences on the dangers of burnout.
Cybersecurity as a slot machine
While the problems of mental health in cybersecurity are no longer taboo, some topics are rarely discussed, such as the degree to which, on a psychological and biochemical level, certain areas of cybersecurity resemble gambling addiction, both in their obsessive pursuit and in the powerful dopamine reward when the intermittent jackpot is hit. There have been a few talks about gamification - such as this talk Kymberlee Price gave in 2017, which discusses operant conditioning and the addictive properties of intermittent reinforcement on behavior.
Per Binde, associate professor of social anthropology at the University of Stockholm, Sweden, is one of the world’s leaders in the area of gambling studies which he has been focused on since 2001. In 2013, Per published an article “Why people gamble: A model with five motivational dimensions” in the journal International Gambling Studies, where he analyzed what drives people to gamble. The four optional motives for gambling, according to the paper, are:
The dream of hitting the jackpot
Social rewards
Intellectual challenge
Mood change
The fifth motive - the chance of winning - is essential to gambling and therefore must be always present.
Source: Why people gamble: A model with five motivational dimensions
It is not hard to draw parallels between slot machine gambling and looking for bad guys - threat hunting and incident response. Both activities rely on the same psychological mechanism: intermittent reinforcement. Slot machine players keep pulling the lever because sometimes they win; threat hunters keep combing through logs non-stop because maybe, just maybe, they will get lucky and find signs of adversarial behavior. And if they don’t find them today, they might hit the jackpot tomorrow. After a few sleepless nights spent searching for signs of malicious activity, people eventually find what they were looking for, which triggers a dopamine spike and drives them to keep searching for more. Since, once in a while, security professionals do find what they are looking for, the behavior gets reinforced, and a new cycle begins.
Wikipedia calls this cycle a compulsion loop or core loop, defining it as a “habitual chain of activities that will be repeated by the user to cause them to continue the activity. Typically, this loop is designed to create a neurochemical reward in the user such as the release of dopamine”. The same page explains that “Compulsion loops are deliberately used in video game design as an extrinsic motivation for players, but may also result from other activities that create such loops, intentionally or not, such as gambling addiction and Internet addiction disorder”.
Image: Compulsion loop in cybersecurity
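The variable-ratio schedule behind this compulsion loop can be illustrated with a short simulation. This is a hypothetical sketch, not based on any real telemetry: each day of log review has a small, assumed chance of yielding a confirmed find, so the “wins” arrive at unpredictable intervals, which is precisely the reinforcement schedule operant conditioning research identifies as the hardest to extinguish.

```python
import random

def simulate_hunts(days, hit_probability=0.05, seed=42):
    """Simulate a threat hunter's intermittent reinforcement schedule.

    Each day of log review has a small, random chance of a 'find'
    (a confirmed sign of adversarial behavior). hit_probability is an
    illustrative assumption, not a measured rate. Returns the length
    of each 'dry spell' between finds, in days.
    """
    rng = random.Random(seed)
    gaps, days_since_last_find = [], 0
    for _ in range(days):
        days_since_last_find += 1
        if rng.random() < hit_probability:
            gaps.append(days_since_last_find)  # record how long this dry spell was
            days_since_last_find = 0
    return gaps

gaps = simulate_hunts(days=365)
print(f"finds in a year: {len(gaps)}")
if gaps:
    print(f"days between finds: min={min(gaps)}, max={max(gaps)}")
```

Because the gaps between finds vary wildly, the next reward always feels one more log query away - the same reason the slot machine lever keeps getting pulled.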
Other factors help turn looking through logs into addictive behavior, and many of them have their equivalents in slot machine gambling:
Near misses - losing situations that are perceived to be close to wins by the gambler. In threat hunting, those are cases when something that almost looks like malicious behavior turns out to have benign origins.
High event frequencies - people who are sensitive to rewards are often attracted to games with fast playing speeds. Cybersecurity, with its large volumes of logs and detections, each of which can be a sign of adversarial behavior, creates perfect conditions for developing an addiction.
Illusion of control - the belief that a skill can affect the outcome of a random or chance event. To a large degree that is true in cybersecurity, yet no one can truly predict with certainty when the company will be attacked, what methods the attackers will use, and so on.
Anticipation - as an article by Jakob Linnet featured by the National Library of Medicine explains, ”dopaminergic anticipation of reward and uncertainty might represent a dysfunctional reward anticipation, which reinforces the gambling behavior despite losses”. In cybersecurity, the anticipation of finding signs of an adversary reinforces the desire to go through more logs.
Attentional bias - another article featured by the National Library of Medicine explains that disordered gamblers often exhibit disproportionally more attention to gambling-related stimuli relative to non-gambling stimuli.
The issue, as always, isn’t that threat hunting and incident response have addictive properties. The problem is that, intentionally or unintentionally, companies have learned to exploit and reinforce this hero behavior.
How hero culture is reinforced
We’ve covered how hero culture occurs, and the principles of addiction, but how is hero culture reinforced in the workplace daily? For employees, it comes down to three basic principles:
Identity: “If not me, then who?”
The sense of mission exhibited by so many working in the field is arguably the main reason why our digital infrastructure has not blown up in our faces just yet. Historically, few public and private institutions have been sufficiently investing in cybersecurity, and a lot of the critical work was done by people driven to do the right thing despite not being economically rewarded for their labor.
While that is true, this sense of mission has a dark side. A large percentage of people I met in the industry have had their entire identity consumed by their work in cybersecurity. Both the public and the private sector have reinforced this problem almost in unison, using military and superhero language that emphasizes the sense of mission and takes it to the extreme. Statements such as “I am saving the world”, “If not me then who”, “I have to sacrifice everything for the greater good”, “With great power comes great responsibility”, and “We are the last line of defense”, to name a few, are so common that we no longer question if they should be. When people who see themselves first and foremost as security professionals at a specific company get laid off from their jobs, they find it much harder to bounce back. Unable to quickly update their resume and look for a new role, many are forced to rethink their own identity because who are they if not an employee of some company?
An over-exaggerated sense of mission and the hero persona, is, in most cases, a coping mechanism that makes people rationalize the sacrifices they are making in their personal lives for their employer’s benefit.
Belonging: Searching for acceptance and recognition
Historically, security teams have been seen as something external to the rest of the organization, and their contributions have been largely underappreciated. Unlike software developers or designers who get a sense of achievement when their work is being used by the company's customers and generates revenue, security practitioners, with the exception of some security engineers, don’t get to proudly showcase the results of their work to their friends. Commonly seen as a “department of no” and a business function that seeks to make everyone’s lives harder, people working on security teams were not (and in most places, are still not) being properly rewarded and recognized for their hard work.
Most people want to know that what they are doing is meaningful and has a positive impact on their company, and security professionals aren’t an exception. Since the only time they are often noticed and recognized for their work is when they do something “heroic” (working long hours to address an incident, dealing with a large-scale security breach, etc.), it is no wonder that hero culture has become such a big part of security teams’ reality.
Security professionals work long hours, but because the amount of work is never-ending, and the rest of the company doesn’t celebrate the security team’s milestones (unlike product releases, revenue and growth targets, etc.), people find it hard to find something they can be proud of at work. To compensate, they turn to their own personal goals - studying for certifications, participating in capture the flag (CTF) competitions, and otherwise looking for ways to feel progress and solidify their standing in the community of peers. Cybersecurity is arguably the only private sector industry where you can find a growing number of medals, challenge coins, and other awards. Challenge coins, in particular, are unique to security among private sector industries: they came from an old military tradition that, through the close ties between the military and hacking communities, ended up being adopted in security. Since challenge coins are based on merit and personal achievements, their possession is a source of pride for many security practitioners globally.
Image Sources: tisiphone.net, Security Blue Team, Kevin DeLong/CyberSocialHub, Oneworldtreasures on Amazon, Thin Blue Line
This is a great example of how the cybersecurity community’s effort to recognize the best has created not just a great way to reward achievements, but also a peer network where people share their knowledge, help one another learn new skills, and provide support not available at the workplace. However, while each of these forms of recognition has a positive impact on the security community on its own, taken together they create a culture of competition and a drive to be first and outperform your peers, which reinforces the previously discussed hero culture. Moreover, the frequently discussed imposter syndrome is nothing but what people who were not recognized as heroes by their community are forced to grapple with daily.
Fear: Living with a constant fear of failure and a drive for perfection
Cybersecurity practitioners are grappling with a constant fear of failure. Since adversaries are working 24/7 to break into the company’s environment, it is almost inevitable that sooner or later they will end up succeeding. When that happens, it won’t matter how much passion and effort everyone on the security team puts into their work, and how many sleepless nights the team has been through a few weeks before: everything except the latest incident will be immediately forgotten.
All this creates expectations that security teams get everything perfect, all the time. If anything falls through the cracks, security teams often get blamed for it, and even when they aren’t blamed, they are typically the ones working overtime to manage the crisis. This makes security practitioners hyper-vigilant about everything they do and incentivizes illogical behaviors. One example is when software engineers are provided a list of thousands of vulnerabilities to fix because every one of them could potentially be exploited, and the understaffed security engineering team doesn't have the capacity to do a full analysis of all of them to eliminate the false positives.
Hero culture is also reinforced for companies
Companies can also have a gambling addiction, but in this case, it is corporate executives and boards that are gambling with risk. Since risk is an ambiguous problem, there is always a hope that the company will be able to withstand another quarter without having to invest in security. The issue here is the incentive alignment. When the breach inevitably happens, the executives who rejected the CISO’s budget request don’t lose their jobs; it is the CISO who is often replaced.
From a business perspective, when a company understaffs the security team, what are the consequences and what are the odds something bad will happen this fiscal year? Turns out there aren’t a lot of real incentives for companies to do things differently. For example:
If we agree that hero culture leads to bad security outcomes, but consider that a breach costs an average 7.5% decline in quarterly earnings, what is the motivation to invest more in security this quarter? How much can I save by putting it off a quarter? Two quarters?
If the security team is asking for an annual budget of $4M to prevent a theoretical $9.4M breach, do I feel lucky gambling that the breach won’t happen for at least 3 years? If I gamble right, the company saves money by not making the security investment until forced, and that money can be allocated to growth initiatives or returned to shareholders.
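The executive’s gamble can be made concrete with back-of-the-envelope expected-value math. The sketch below uses the illustrative figures above ($4M annual budget, $9.4M breach cost) plus an assumed 25% annual breach probability, a made-up number for illustration only; a real model would also have to account for the earnings decline, customer churn, and reputational damage that follow a breach.

```python
def expected_costs(annual_budget_m, breach_cost_m, annual_breach_prob, years):
    """Compare funding security vs. gambling that no breach occurs.

    Returns (cost_of_investing, expected_cost_of_gambling) in $M.
    The gamble's expected cost counts only the direct breach cost;
    it ignores earnings impact, churn, and reputational damage.
    """
    cost_of_investing = annual_budget_m * years
    # Probability of at least one breach over the whole horizon
    p_any_breach = 1 - (1 - annual_breach_prob) ** years
    expected_cost_of_gambling = p_any_breach * breach_cost_m
    return cost_of_investing, expected_cost_of_gambling

# $4M budget and $9.4M breach are the article's illustrative figures;
# the 25%/year breach probability is an assumption for this sketch
invest, gamble = expected_costs(4.0, 9.4, 0.25, years=3)
print(f"invest: ${invest:.1f}M vs gamble: ${gamble:.2f}M expected")
```

Under these assumptions, three years of security spend ($12M) dwarfs the gamble’s expected direct cost (about $5.4M), which is exactly why the naive quarterly math tempts executives to keep gambling: the equation omits the hidden costs, including the free overtime of the heroes who absorb the incident when the bet finally loses.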
When security leaders and practitioners work 60-80 hours per week to protect their organizations, they enable their employer’s risk gambling addiction by providing free labor, further incentivizing businesses to ignore security and keep their investment in cyber defense low until an incident forces their hand. When deciding how much to invest in security, company executives are essentially making trade-offs about risk and quarterly earnings. What is not a part of the equation are people - security leaders and practitioners forced to work double time (or more) to deal with security incidents.
Net impact of hero culture
To be fair, a lot of the characteristics of what we define today as hero culture are precisely what allowed cybersecurity to evolve and mature. The desire to develop one’s hacking skills, get better, and win against an adversary, all critical parts of hacking’s origins, allowed security practitioners to grow their hands-on skills. The sense of mission and a readiness to sacrifice one’s own personal interests for the greater good of the community enabled the protection of digital infrastructure even when nobody understood the importance of this protection.
Additionally, it has opened doors around the world for talented people to achieve respect and recognition for their skills and abilities regardless of where they are from, what accent they speak with, and how much money they have to invest in their training. All that matters is how much effort one is willing to put in, and how good they are at their craft.
Another benefit is that in an actual security emergency, having people who thrive in crisis is a tremendous asset to defense operations.
Unfortunately, hero culture is also responsible for many negative outcomes for both employees and their employers.
Hero culture has allowed businesses to avoid appropriately staffing their security organizations (why spend money on more staff if security practitioners will just work 70-hour weeks at no extra cost anyway?). When companies understaff their security teams, they see an increase in unforced errors. In the case of product or IT security assurance, an understaffed team may not have time to threat model and security test each new feature or application that is deployed, leading to an increased attack surface. That means that security response teams have even more ground to cover. Security attacks are an inevitability, but prevention and recovery are more difficult with an under-resourced and over-extended team that may miss critical problem areas and make mistakes.
Hero culture plays havoc with staffing economics for companies. By enabling understaffing, hero culture indirectly leads to teams that can’t afford to ramp up early-in-career staff and only open reqs for experienced security practitioners who can “hit the ground running”. But experienced security talent is in short supply and commands a high salary, so that position may remain unfilled for months while the team remains understaffed. When hiring managers do open “early in career” jobs, they often have unrealistic expectations that entry-level applicants are superhumans who are fully consumed by security, have twenty-five certifications, and a laundry list of skills, tools, and technologies on their resume, instead of hiring someone with potential and helping them grow.
Hero culture is largely responsible for the epidemic of burnout, addictive behaviors, and substance abuse in the security community. When people define their worth as humans by the work they do, it changes their sense of identity and makes them less resilient to dealing with challenges real life is throwing at them. Burned-out staff take more sick days, exhibit diminished team morale, and may leave the company for the promise of a healthier work environment elsewhere.
Hero culture is counterproductive to building a cohesive company, as it creates an “us vs them” mentality between security teams and the rest of the employees. In his talk on this topic, George Sandford highlights another reason why hero culture is damaging to security. Heroes need villains, and this causes security teams to distance themselves both from the adversaries and those they are hired to protect. Seeing everyone in an organization as “the weakest link in security” creates the “us vs. them” mentality, siloing security as a special business function that holds sacred knowledge and has the power to grant approvals for tools, workflows, and processes of other departments.
Hero culture often reinforces unrealistic expectations that security teams will be perfect and prevent every possible error the engineering organization may make. So they over-correct and turn into the “House of No”. No one likes being told no all the time, so people in an organization just work around the security restrictions and avoid the security team - leading to more missed risks and perpetuating the hero culture crisis.
Last but not least, hero culture incentivizes people on security teams to hoard knowledge and seek to show themselves in a great light instead of working together as a team. This creates the so-called “brilliant jerks” - people who are impossible to work with, but who are tolerated and rewarded because of the knowledge and experience they bring to the table.
Instead of investing in growing their security teams, improving their practices, and getting better tooling, businesses rely on security teams’ compulsion to work long hours and sacrifice work-life balance to keep their companies protected. And it is not surprising that people’s drive to defend their organization and find signs of malicious behavior creates a sense of mission that feeds hero culture in the industry.
Going into the future
I don’t know how to compare the benefits of hero culture to its deeply detrimental costs to people and businesses. The real math cannot be done on the level of community feelings; the equation will look different depending on who is looking at this problem and how their life or the lives of their loved ones have been affected by it. There simply hasn’t been enough research done on the business impact of the hero culture phenomenon and resulting burnout in cybersecurity, but we can start by learning from studies about the effects of burnout on Emergency Medicine professionals.
So where do I hope to see us go from here?
As we go into the future, I am first and foremost hoping that there will be more data-driven research and discussion about the impact of hero culture, not only on security professionals' health but on companies' ability to protect their customers' data. So far, I have only been able to find two talks on this topic, both by George Sandford (you can watch his presentation at the Diana Initiative 2023 conference here: Don’t Get Tangled Up in Your Cape: Hero Culture as a Negative Force in Cyber). We need much more - more research, more data, and more discussions about this problem.
Second, we need to continue maturing the security space and our ability to measure and communicate cost/value. It is akin to building a company: when there are 20 people, it makes sense that they will be self-driven and work hard to achieve the desired outcome, even if the pay is low and there’s nobody else to ask for help. But, as the business grows, it becomes important to put the right systems and processes in place, so that the company can scale. Since cybersecurity started in communities of hackers, it makes sense that the origins of the field were built on the sense of mission and the sense of purpose of early practitioners. Now that security has become a business need for every company, the industry has grown too big to treat it like a hobby.
In the past, when security was seen as an esoteric profession, it made sense for companies to hire heroes and ask them to do their best to keep the company protected. Fast forward to today, we have accumulated a solid knowledge base, formalized a lot of our practices, and developed great avenues for knowledge sharing such as industry conferences, Information Sharing and Analysis Centers, and the Cybersecurity and Infrastructure Security Agency, to name some. We should continue building systems and processes for hiring, training, and upskilling talent, measuring the security posture, benchmarking one organization against another, and so on. As the saying goes, what got us here won’t get us there. As it stands, hero culture is preventing us from being able to evolve the industry, and that is why it needs to go.
Third, businesses need to continue investing in security. In 2024, it is no longer acceptable to see security practitioners be overworked, overextended, and as a result, underpaid given the hours they’re working. It happens too often that companies cut their already under-resourced security teams, expecting those who stay to pick up the work of their colleagues and do the same amount of work, but with fewer resources. To make it easier to justify more investment in cyber defenses, the industry needs to continue moving from promise-based to evidence-based security. One of the challenges today is that there is no easy way to empirically prove that investing more money or buying an extra tool is going to have a material impact on an organization's security posture. As the industry matures, I am optimistic that it will become easier to communicate the value of security, and that more companies will see it as a business enabler, competitive differentiator, and protector of shareholder value.
It will take time to change this toxic relationship. Security professionals need support to detox from their work addictions and develop sustainable fulfilling careers that don’t consume all of their energy like a black hole. Companies need to invest more effort in forecasting the cost of their risk gambling. In addition to reducing errors and adding capacity for business continuity, adding staff to a security team will give your organization the breathing room to hire some early-in-career security folks and train them up - an increase in supply of experienced security professionals will improve your business resilience, shift the supply/demand hiring economics for highly qualified security professionals, and flatten your security headcount spending.