The government's role in shaping the future of cybersecurity
Looking at the government's role as a market maker, recruiter, customer, talent supplier, and regulator, all at the same time
The unique relationship between the government and cybersecurity
Cybersecurity is a unique field in many different ways, and one of them is certainly its relationship with the government. The government is a recruiter, a customer, a talent supplier, and a regulator at the same time.
The way the government sees cybersecurity has evolved dramatically over the past two decades. Just as Microsoft did a complete 180 on open source, so did the government in its relationship with the cyber world: from treating it as a perpetual adversary to staffing recruitment booths for the CIA, NSA, and Secret Service at events such as Black Hat and DEF CON. Initially, cybersecurity received little attention at the national level. That changed on September 11, 2001, when the terrorist attacks carried out by members of al-Qaeda made it clear that the US government and Western values were under attack. For the next decade, cyber threats were treated primarily as threats to infrastructure, and many security practitioners started to gain a voice and a seat at the table. The emergence and rise of ransomware in the mid-2010s, which coincided with and was to a large degree propelled by the growth of cryptocurrency, attracted a lot of attention from executives. It is worth mentioning that since ransomware actors do not typically target military and government networks, IT modernization in the public sector started to lag. The government, which once pioneered and led cybersecurity efforts, has become a laggard in the game it started.
Career mobility between the public and private sectors is impressive: it is hard to find another industry where government experience offers such a strong stepping stone to a leadership role in the private sector. In cybersecurity, these transitions are common, and alumni of organizations such as the NSA, CIA, and FBI are highly sought after by businesses looking for experienced security leaders.
The government holds a monopoly on force, and cybersecurity is no exception: the NSA, for example, is seen as one of the few dream workplaces for technical red-teamers. As part of the defense budget, the government pumps significant capital into R&D as well as offensive and defensive cyber warfare capabilities. Interestingly, even though Fort Meade is light years ahead of the private sector when it comes to offense, the vast majority of vulnerabilities appear to be disclosed by private researchers, not the government.
As a regulator, the government can set and enforce standards and define what constitutes an acceptable level of security. Cybersecurity vendors tend to support, and actively lobby for, new security standards and stricter regulations: those help customers prevent cyber incidents and, as a byproduct, increase demand for the vendors' products and services.
As a customer, the government is purchasing products and services of security providers at scale. Companies like Google, Mandiant (now a part of Google), and Microsoft are actively working to expand their already deep relationships with the federal government.
There is one more factor that makes security employees in government, those protecting company networks, and even those working for cybersecurity vendors feel the same, and that is the sense of mission that unites everyone on the side of defense. On one hand, cybersecurity is part of the broader tech market, where investors deploy capital, founders look to grow their companies, sales teams are incentivized to get everyone to use their firm's products, and so on. On the other hand, cybersecurity is also a public service, essential to the normal functioning of society. Most cybersecurity enthusiasts who spend countless hours at events and community meetups, mentoring aspiring security talent and evangelizing the importance of cyber hygiene, do not do it for the money. It is a sense of mission, a sense of service similar to that seen in the military and sometimes in the public sector, that propels security forward.
The government’s role in shaping the future of cybersecurity
The role the government should play in cybersecurity is a highly debated topic. The complexity lies in the fact that many different lenses could be used to argue the case: for example, if we draw parallels with warfare, then having a “cyber military” defending the digital world naturally becomes a possibility. In this piece, I look at one of the many ways to define the government’s role in cybersecurity and explain why specific aspects of it would or would not make sense. Note that while I will often be using examples from the US, the same approaches and ideas are equally valid for the governments in other countries.
Securing itself: the critical need for safeguarding government data
Before we discuss the role of the government in helping other members of society solidify their defenses, it is critical to emphasize that first and foremost, it needs to secure itself. The government holds treasure troves of data about its citizens, critical infrastructure, and trade secrets, as well as other information whose safeguarding is both a matter of national security and an important part of maintaining citizen trust.
In 2022, it was reported that Iranian government-sponsored malicious actors compromised the network of a US federal agency and installed cryptocurrency mining software on the agency's computers. Earlier, in 2020, a massive breach enabled hackers to roam several US government networks for months, including those of the Department of the Treasury and the Department of Commerce. A few years before that, in 2015, the hack of the Office of Personnel Management compromised the data of over 20 million Americans: attackers obtained personally identifiable information such as names, Social Security numbers, dates and places of birth, and addresses of government employees and people who had undergone background checks. This is just a brief snapshot showing that governments (not just in the US but all over the world) are vulnerable to attacks that endanger their citizens and raise alarms about national security.
Local governments and public and municipal institutions, such as schools, emergency services, and hospitals, are especially vulnerable because of a lack of resources. The federal government has been taking steps to promote sound IT and cybersecurity practices, but the work in that direction is just starting. Bureaucracy hinders the ability to decide what needs to be done beyond what is regulated, which is not enough (I have previously discussed the "compliance vs. security" topic in depth).
Setting the rules of the game: legislating cybersecurity
Because the private sector focuses on maximizing shareholder value and increasing profits, without appropriate regulation and enforcement mechanisms it will typically implement only the minimum measures that allow it to achieve those goals. At the same time, there is a shared understanding among security leaders, practitioners, and policymakers that we need to do better when it comes to cybersecurity: businesses of all sizes need to implement measures that can actually protect them from cyber attacks.
Today, the government is the market maker: by legislating cybersecurity requirements, it produces the demand for new solutions. Security vendors are happy to lobby new regulations, frameworks, and compliance requirements because they help sell more products. The flow goes as follows: breaches lead to lobbying for new regulations, and this lobbying translates into legislative requirements, which in turn drive demand for cybersecurity.
Not every organization has the right mindset or is willing to allocate sufficient resources to make this a reality, so having the government mandate a minimum level of security surely looks like a viable way to solve the problem. The challenge is that mandating minimum measures has proven to result in box-checking, as typically happens with compliance requirements when the spirit and original intent are lost in the desire to feature a "compliant" badge on the marketing site. Furthermore, the government does not have enough resources to assess the implementation of its security requirements, leaving it to each company to do so internally. Where a minimum set of mandated security measures could work is in determining the size of the fines companies must pay when customer information is exposed: if it is determined that the mandated security checks and balances were not properly implemented, and that this resulted in or contributed to the exposure, affected businesses can be ordered to pay substantially more than they would otherwise.
Issuing directives and recommending standards is just one of the areas the government should look at. Additionally, there is the need to decide what parts of cybersecurity should be legislated and what shouldn’t, what constitutes a cyber offense, and so on. Lawmakers must stay on top of new trends in the rapidly-changing cyberspace, making sure that the nation's legislation is always kept up to date.
Creating conditions for market self-regulation
It is not effective to rely solely on regulations to promote cyber maturity, and most importantly, the government lacks the resources to enforce them effectively. Fortunately, there is a model that has proven effective in other areas of the market: intentionally creating conditions for self-enforcement and self-regulation. Effective ways to do this include maintaining the continuous threat of regulation and designing mechanisms for cyber insurance.
Threatening businesses with the introduction of new regulatory requirements tends to motivate them to improve their security measures and do what is needed to avoid having the government meddle with their affairs. Surely, not every organization does it, but large and public enterprises tend to react in that exact way. In many instances, taking this path may prove more effective than legislating long lists of requirements businesses are expected to comply with only to create a culture of checking the compliance boxes.
The other area where the government can make a tangible impact is by supporting the growth of cyber insurance. I have previously discussed the critical role insurance can play in regulating the market and continuously leveling up organizations' cyber maturity. The problem with cyber insurance is that it is not yet possible to understand the degree to which potential losses may be correlated, and therefore to avoid massive payouts leading to bankruptcy. Most other types of risks that may lead to a massive loss for insurance companies can be managed or reinsured. Think about earthquakes: no insurance company would agree to cover all buildings in an area with a high risk of natural disasters. With cyber, it is not possible to predict to what degree a catastrophic event would affect all or some of a company's policyholders: a dental office in Tokyo, an accounting company in London, a factory in Mexico City, and a fire station in New York can all be hit by the same cyber attack at the same time.
In a previous deep dive, I argued that “...before 9/11, terrorism coverage was typically included as part of general insurance coverage at no extra cost to the insureds. After the attacks, many insurance companies started excluding it entirely, and those that didn’t ended up increasing premiums so much that it became prohibitively expensive. In response, the U.S. Congress in 2002 passed the Terrorism Risk Insurance Act, which has since been renewed four times: 2005, 2007, 2015, and 2019. This federal program enables the US government to share monetary losses to commercial policyholders with insurance companies (up to $100 billion), making it possible for businesses to purchase terrorism coverage. Without government support, we likely would not have many providers willing to insure this type of risk. The government could take a similar approach to cyber insurance as well”.
The government can indeed create a similar program to give insurers more confidence and encourage them to underwrite cyber risk. This, however, has to be done in conjunction with establishing clear boundaries and helping cyber insurance providers develop deeper technical expertise in cybersecurity. Today, few insurance companies truly understand what they are underwriting, what attack vectors exist in different environments, and what reduces an organization's likelihood of suffering a cyber loss; if not addressed, these gaps will continue to cause easily preventable mistakes. Government support has to be designed to provide a backstop in the case of catastrophic black swan events, not to create moral hazard by guaranteeing bailouts of bad business practices and reasonably foreseeable losses.
Another way the government can help the insurance industry mature, and by doing so accelerate the maturation of cybersecurity, is by supporting research designed to build quantitative models of cyber risk. Initiatives such as Cyber Catalyst by Marsh and the FAIR Institute have made some progress, but we are still far away from being able to model cyber risk with the same accuracy as other types of exposure. Given that the government has access to treasure troves of information about cyber attacks, it is well positioned to help solve this problem.
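To make the modeling gap concrete: a common starting point in quantitative risk work (including FAIR-style analyses) is a frequency-severity simulation, where you model how often loss events occur and how large each one is, then simulate many years to estimate expected annual loss and tail risk. The sketch below is purely illustrative; the Poisson frequency, lognormal severity, and all parameter values are assumptions, not calibrated figures.

```python
import math
import random

def poisson(lam: float, rng: random.Random) -> int:
    """Sample a Poisson-distributed event count via Knuth's multiplication method."""
    threshold = math.exp(-lam)
    count, product = 0, 1.0
    while True:
        product *= rng.random()
        if product <= threshold:
            return count
        count += 1

def annual_loss(rng: random.Random, freq: float = 2.0,
                sev_mu: float = 10.0, sev_sigma: float = 1.5) -> float:
    """One simulated year: Poisson number of incidents, lognormal loss per incident."""
    events = poisson(freq, rng)
    return sum(rng.lognormvariate(sev_mu, sev_sigma) for _ in range(events))

rng = random.Random(42)  # fixed seed for reproducibility
losses = sorted(annual_loss(rng) for _ in range(10_000))

expected_annual_loss = sum(losses) / len(losses)
var_99 = losses[int(0.99 * len(losses))]  # 99th-percentile ("1-in-100-year") annual loss

print(f"Expected annual loss: ${expected_annual_loss:,.0f}")
print(f"99% VaR:              ${var_99:,.0f}")
```

The hard part in cyber is not this simulation machinery but calibrating frequency and severity from real incident data, and modeling correlation across policyholders; sampling each insured independently, as above, understates tail risk when a single attack hits many policyholders at once.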
Facilitating the collaboration and providing infrastructure for intelligence sharing
Crafting regulations and setting minimum standards for companies and organizations to comply with is not enough to increase the maturity of the private sector’s cyber defenses. Given the number of unique risks faced by businesses of different sectors, geographies, sizes, and industry verticals, the government cannot hope to provide actionable and relevant recommendations for every company. What it does have the ability to do is facilitate collaboration between different organizations, and enable knowledge sharing among them. This is especially relevant as the behaviors of industry leaders tend to be mirrored by others in the segment, making it possible to disseminate best practices organically, through knowledge sharing and collaboration.
In a world where everything is interconnected, having the government intentionally design a platform for collaboration is both critical and an easy sell: everyone benefits from having those around them get stronger, as we are only as secure as the weakest link. There are multiple examples that highlight the interconnected nature of critical infrastructure and the impact a system outage can have on our lives; a case in point is the FAA system outage in January 2023, which disrupted thousands of flights across the US.
Successful models for cyber threat information sharing include Information Sharing and Analysis Centers (ISACs) and Information Sharing and Analysis Organizations (ISAOs). ISACs were initially established in 1998 under a presidential directive to enable owners and operators of critical infrastructure to exchange information about cyber threats and security best practices. The National Council of ISACs lists 25 of them, operating in sectors from healthcare and space to automotive. ISAOs, on the other hand, are voluntary organizations established under a 2015 executive order to encourage cyber threat information sharing among organizations that do not belong to ISACs or have unique needs that ISACs cannot address.
The government itself can share cybersecurity threat intelligence with security teams and service providers, making it easier to understand the threat landscape and gain the upper hand against attackers. We have seen some successful examples of this, such as Automated Indicator Sharing (AIS), a capability run by the Cybersecurity and Infrastructure Security Agency (CISA). AIS makes it easy to exchange cyber threat indicators and defensive measures among federal and non-federal entities in real time, helping defenders protect their networks against commodity threats.
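AIS exchanges this data as STIX objects delivered over the TAXII protocol. As a rough illustration of what a shared indicator looks like, the Python sketch below builds a minimal STIX 2.1-style indicator for a fictional malicious IP address; the field set is simplified and all values (the name, the IP, which comes from a reserved documentation range) are made up for the example.

```python
import json
import uuid
from datetime import datetime, timezone

def make_indicator(pattern: str, name: str) -> dict:
    """Build a minimal STIX 2.1-style indicator object (illustrative only)."""
    now = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%S.%fZ")
    return {
        "type": "indicator",
        "spec_version": "2.1",
        "id": f"indicator--{uuid.uuid4()}",  # STIX ids are type--UUID
        "created": now,
        "modified": now,
        "name": name,
        "pattern": pattern,        # written in the STIX patterning language
        "pattern_type": "stix",
        "valid_from": now,
    }

# A fictional indicator flagging traffic to a known-bad IP
ioc = make_indicator("[ipv4-addr:value = '203.0.113.42']",
                     "C2 server observed in commodity campaign")
print(json.dumps(ioc, indent=2))
```

The value of a machine-readable format like this is that a firewall or SIEM can ingest thousands of such objects automatically, which is what makes real-time sharing at AIS scale feasible at all.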
Helping build the innovation ecosystem
The government plays a unique role in helping to build the innovation ecosystem for cybersecurity.
The country that showed the rest of the world what is possible is Israel, which now has the highest number of startups per capita. Israel's military is the country's most prolific startup incubator and accelerator, with alumni of the famous cyber-warfare-focused Unit 8200 alone starting dozens of cybersecurity companies every year. While many of the factors behind this success, such as fostering public-private partnerships, attracting capital, and motivating entrepreneurs to reinvest in their own community, can be replicated by other nations, some cannot. When Israel was established in 1948, the country was under constant threat, which forced it to invest in developing superior military capabilities; over time, the Israel Defense Forces evolved into a catalyst of the nation's economy without that being the initial intent. Nevertheless, many lessons from Israel's model can be adopted by other countries.
One way to support new ideas is to fund cybersecurity research, especially research focused on securing critical infrastructure. In 2022, the US Department of Energy (DOE) announced $12 million for six new research, development, and demonstration (RD&D) projects designed to build innovative cyber defense technology to protect America's power grid from the ever-increasing number of attacks. By funding and supporting scientific discovery, the government fosters novel thinking and ensures that the country will be prepared for the impact of quantum computing and other game-altering technologies on the security of tomorrow.
To encourage innovation in cybersecurity, it is critical to attract capital and motivate investors to come in early. The challenge is that cybersecurity is a deeply technical discipline, and it may take five to ten years or even longer to commercialize new inventions - more time than a traditional VC can afford. This is where the government can come in with projects such as In-Q-Tel, the taxpayer-funded venture capital firm that has been quite successful in supporting defense-focused innovation. A similar vehicle for cybersecurity could help de-risk early-stage security technology and thereby attract more investors to the space. For those interested in funding early-stage cyber-focused innovation, I have previously written a deep dive about running accelerators and incubators in the industry.
Funding research, attracting venture capital, fostering partnerships, working with the vendor market, and other initiatives, while important, will not on their own lead to a thriving ecosystem without one key ingredient: cybersecurity talent. The US government has been taking steps to tackle this problem: in 2022, it announced a plan to create hundreds of registered apprenticeship programs with private sector companies to help train the next generation of cybersecurity professionals. As I discussed before, last year “we have seen the continued work from the government side to build awareness of cyber careers and take actionable steps to close the gap, including the collaboration between the Department of Commerce, Department of Labor, and Department of Education. We are also seeing the growth of cybersecurity programs offered by universities, community colleges, bootcamps, and other training providers, initiatives to reskill veterans and get them ready for careers in security, the growth in the number of scholarships, and so on”. A lot more needs to be done in this area, including helping people develop cybersecurity skills from a young age; a good model here is Israel, the only country in the world where cybersecurity is an elective in high school.
“Defending forward” with the active military
A much more controversial question concerns the role of the military in cybersecurity. Traditionally, the government has been responsible for law enforcement and security services that safeguard its citizens from internal and external threats. Internal defense is handled by police and intelligence services, while external (international) defense is handled by the military, military alliances, and the like. The role of the military in cybersecurity is not entirely obvious. However, as Barack Obama stated in 2016, “Now, more and more, keeping America safe is not just a matter of more tanks, more aircraft carriers, not just a matter of bolstering our security on the ground. It also requires us to bolster our security online. As we've seen in the past few years and just in the past few days, cyber threats pose a danger not only to our national security but also our financial security and the privacy of millions of Americans.”
There are two dimensions along which the military's role in cyberspace can be examined: protecting its own networks and data, and protecting the country at large. The former is not controversial, given that every entity, especially one whose advantage relies on limiting access to information, must protect its data; the latter is much more complex. The 2018 US Department of Defense Cyber Strategy, for instance, states that among other initiatives, “The United States seeks to use all instruments of national power to deter adversaries from conducting malicious cyberspace activity that would threaten U.S. national interests, our allies, or our partners… The Department will work with its interagency and private sector partners to reduce the risk that malicious cyber activity targeting U.S. critical infrastructure could have catastrophic or cascading consequences.”
The extent to which the military should be involved in helping the country secure its networks is heavily debated. On one hand, as Ian Wallace observed, “Any country that depends too heavily on the military for cybersecurity will likely find itself reducing the incentives for the private sector to develop longer-term solutions.” On the other hand, there are cases when not interfering could lead to significant damage to the nation's economy and critical infrastructure. The denial-of-service attacks against US banks in 2012-2013, known as Operation Ababil, are a good example of the latter, as are foreign states' attempts to interfere in US elections.
The US Congress has been taking active steps to define the military's role in this picture, known as “defending forward”: gaining access to the networks and systems where adversaries operate, tracking their activities, and stepping in when there is a threat to US critical infrastructure or national security. Going forward, the role of the military in cyber warfare will evolve, though it will likely remain rooted in effective collaboration between the government, law enforcement agencies, the nation's military forces, and the private sector - security vendors, service providers, and customers. It is worth noting that while I have been using the US as an example, the same applies to other nations across the globe.
Helping overcome the talent shortage by retraining veterans
Hiring veterans onto security teams can help close the talent gap: they have real battlefield experience, and many have been trained in cybersecurity. About 200,000 service members transition to civilian life each year, according to the US Department of Defense. Brian NeSmith, CEO of Arctic Wolf Networks, notes that “...many veterans’ programs are promoting opportunities in the industry and providing cybersecurity training and certifications to a growing number of interested veterans.” People with real-world battlefield experience are well positioned to think like an adversary and approach security from the standpoint of fundamentals, rather than relying on security products to do the job for them. Many have also had hands-on experience with cutting-edge cybersecurity tools and technologies. More important than credentials and experience, ex-service members bring the right attitude: a sense of duty, a strong work ethic, problem-solving skills, the ability to work under pressure, and the mission-driven mentality needed in cybersecurity.
To help ex-service members transition into private sector cybersecurity roles, the government can support them with training and education. We have seen several initiatives directed at exactly that: CISA provides Cybersecurity Training and Education for Veterans, a user guide for those who formerly served in the US Armed Forces, as well as the Federal Virtual Training Environment (FedVTE), while the Commonwealth of Virginia launched Cyber Vets Virginia. Aside from the initiatives the government runs on its own, it should look for opportunities to collaborate with not-for-profit initiatives and training providers pursuing the same goal, such as VetSec, Inc. and TechVets.
In the past few years, leaders of developing countries have been forced not just to pay attention to cybersecurity but to play an active role in how it is done. And in the years to come, as cyber becomes a cornerstone of national security, we will see even more development in the industry. The NIS Directive has proven to be an effective framework for the maturation of security in the EU, and we are seeing more and more effective collaboration between different parts of the government in the US.
The role of the government in cybersecurity has shifted from protecting its own networks and acting as a regulator to being a creator and active facilitator of the ecosystem. The Home Depot approach - “you can do it, we’re here to help” - is evolving into a more proactive one. While there are many examples of how this is done in the US, where the White House has been evolving its cyber policies for 25 years, the model extends beyond one country and is being adopted globally. Examples include:
The Singapore Cybersecurity Consortium (SGCSC) is created “for engagement between industry, academia and government agencies to encourage use-inspired research, translation, manpower training and technology awareness in cybersecurity”.
In Ireland, the National Cyber Security Centre (NCSC) is responsible for “advising and informing Government IT and Critical National Infrastructure providers of current threats and vulnerabilities associated with network information security”.
PLDT and its wireless unit Smart Communications, Inc. work with the Philippine Air Force to boost the country’s cyber defense capabilities.
As people working in cybersecurity come from different backgrounds - the private sector, public service, academia, the military, regulatory bodies, insurance, and the like - they will see the government's role in the industry differently. I see its role as that of an enabler: facilitating information sharing, fostering collaboration, helping educate companies and organizations about best practices for defending their environments, mandating that operators of critical infrastructure build security into their day-to-day operations, and establishing the rules of the game.